Data Engineer

November 19

Apply Now

Darkroom

Marketing • B2B • SaaS

Darkroom is a world-class growth marketing company specializing in services that span the entire customer journey. Ranked the 385th fastest-growing private company in America on the 2023 Inc. 5000, Darkroom focuses on maximizing ROI for its clients. Its services include Amazon Marketplace Management, Creative Services, Paid Media Management, Retention Marketing, and Website Optimization. With a strong emphasis on integrating finance, creativity, and performance into growth marketing strategies, Darkroom leverages data infrastructure and attribution to improve advertising effectiveness and profitability. This approach has generated over $1 billion in attributable revenue for clients, making Darkroom a trusted partner for leading companies worldwide.

51-200 employees

🤝 B2B

☁️ SaaS

📋 Description

• Architect, build, and optimize scalable end-to-end data pipelines
• Design, implement, and maintain ETL/ELT processes
• Collaborate with product and engineering to define data requirements
• Manage data pipeline onboarding for new clients
• Maintain and troubleshoot workflows
• Ensure data reliability, integrity, and security at scale
• Troubleshoot, performance-tune, and document all pipelines and data workflows
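
As a rough illustration of the ETL/ELT work described above, here is a minimal extract-and-load sketch in Python. The source endpoint, GCP project, dataset, and table names are hypothetical placeholders, not details from the posting; transformation would typically happen downstream (for example, in dbt models).

```python
# Minimal ELT sketch: pull JSON records from a source API and append them
# to a BigQuery staging table for downstream transformation.
# The endpoint and table IDs below are hypothetical examples.
import requests
from google.cloud import bigquery

SOURCE_URL = "https://api.example-client.com/v1/orders"  # hypothetical source
TABLE_ID = "my-gcp-project.raw.orders"                   # hypothetical table


def extract(url: str) -> list[dict]:
    """Fetch raw records from the source API."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()


def load(rows: list[dict], table_id: str) -> None:
    """Append raw rows to a BigQuery staging table."""
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        write_disposition="WRITE_APPEND",
        autodetect=True,  # let BigQuery infer the schema for this sketch
    )
    job = client.load_table_from_json(rows, table_id, job_config=job_config)
    job.result()  # block until the load job finishes and surface any errors


if __name__ == "__main__":
    load(extract(SOURCE_URL), TABLE_ID)
```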

🎯 Requirements

• 5+ years of hands-on experience building production-grade data pipelines
• Expert-level Python
• Strong experience with dbt, Postgres, BigQuery, and core GCP data services
• Familiarity with analytics, BI tools, and data visualization platforms
• Excellent communication skills
• Low-ego, high-ownership, and passionate about building product-focused solutions
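
To give a sense of how the Python, Postgres, and BigQuery pieces of this stack might fit together, here is a small reliability-check sketch: comparing row counts between a Postgres source table and its BigQuery copy. The connection string, project, and table names are hypothetical and only for illustration.

```python
# Sketch of a simple pipeline reliability check: compare row counts between
# a Postgres source table and its BigQuery destination.
# The DSN, project, and table names are hypothetical placeholders.
import psycopg2
from google.cloud import bigquery

PG_DSN = "postgresql://user:password@localhost:5432/app"  # hypothetical
PG_TABLE = "public.orders"                                # hypothetical
BQ_TABLE = "my-gcp-project.raw.orders"                    # hypothetical


def postgres_count(dsn: str, table: str) -> int:
    """Count rows in the Postgres source table."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]


def bigquery_count(table: str) -> int:
    """Count rows in the BigQuery destination table."""
    client = bigquery.Client()
    rows = client.query(f"SELECT COUNT(*) AS n FROM `{table}`").result()
    return next(iter(rows)).n


if __name__ == "__main__":
    src, dst = postgres_count(PG_DSN, PG_TABLE), bigquery_count(BQ_TABLE)
    if src != dst:
        raise SystemExit(f"Row count mismatch: postgres={src} bigquery={dst}")
    print(f"Counts match: {src} rows")
```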

🏖️ Benefits

• Health insurance
• Retirement plans
• Flexible work arrangements
• Professional development
• Stock options

Apply Now

Similar Jobs

November 19

Data Engineering Practice Lead responsible for Microsoft Fabric architecture and strategy. Leading data migrations while mentoring the engineering team towards modern data practices.

October 29

Data Engineer focused on improving e-commerce analytics and insights within the Merchant Intelligence Team. Using BI tools, SQL and Python for real-time data solutions at Constructor.

Docker

Numpy

Pandas

PySpark

Python

Spark

SQL

October 10

Developing end-to-end automation of data pipelines for the SIXT Data platform. Exploring AWS and big data technologies to enable self-service data for users.

Airflow

Amazon Redshift

Apache

AWS

Cloud

Distributed Systems

EC2

Jenkins

Python

SQL

Terraform

October 3

Data Engineer focusing on improving data pipelines for Veeva Link, a life sciences cloud solution. Join a mission-driven organization supporting rapid therapy development.

Apache

AWS

Cloud

Google Cloud Platform

Java

PySpark

Python

Spark

September 11

Senior Data Engineer building AI-focused data models and pipelines at Wellhub, a corporate wellness platform. Processing multimodal user data to improve recommendations and product quality.

Airflow

Cloud

Kafka
