Data Engineer, Python, PySpark, AWS Glue, Amazon Athena, SQL, Apache Airflow

Job not on LinkedIn

September 11

Apply Now

PrideLogic

Detailed information about PrideLogic is not yet available, as the company's website is under construction. More details may be added once the site is updated.

11 - 50 employees

📋 Description

• Build, optimize, and scale data pipelines and infrastructure using Python, TypeScript, Apache Airflow, PySpark, AWS Glue, and Snowflake

• Design, operationalize, and monitor ingest and transformation workflows: DAGs, alerting, retries, SLAs, lineage, and cost controls

• Collaborate with platform and AI/ML teams to automate ingestion, validation, and real-time compute workflows; work toward a feature store

• Integrate pipeline health and metrics into engineering dashboards for full visibility and observability

• Model data and implement efficient, scalable transformations in Snowflake and PostgreSQL

• Build reusable frameworks and connectors to standardize internal data publishing and consumption

• Act as a technical leader who solves complex problems, ships features quickly, and elevates the codebase

🎯 Requirements

• 4+ years of production data engineering experience

• Deep, hands-on experience with Apache Airflow, AWS Glue, PySpark, and Python-based data pipelines

• Strong SQL skills and experience operating PostgreSQL in live environments

• Solid understanding of cloud-native data workflows (AWS preferred) and pipeline observability (metrics, logging, tracing, alerting)

• Proven experience owning pipelines end-to-end: design, implementation, testing, deployment, monitoring, and iteration

• Experience with Snowflake performance tuning and cost optimization (preferred)

• Real-time or near-real-time processing experience, such as streaming ingestion, incremental models, or CDC (preferred)

• Hands-on experience with a backend TypeScript framework, e.g., NestJS (preferred)

• Experience with data quality frameworks, contract testing, or schema management, e.g., Great Expectations, dbt tests, OpenAPI/Protobuf/Avro (preferred)

• Background in building internal developer platforms or data platform components: connectors, SDKs, CI/CD for data (preferred)

• English fluency (the application includes questions about English proficiency)

🏖️ Benefits

• Fully remote position

• Compensation paid in USD

• Work hours aligned with the EST (9 AM to 6 PM) or PT time zone



Built by Lior Neu-ner. I'd love to hear your feedback — Get in touch via DM or support@remoterocketship.com