Ads Data Engineer

November 20


Hopper

Travel • Technology

Hopper is a travel booking platform that offers a one-stop solution for planning trips, including flights, hotels, rental cars, and homes. It offers advanced options such as price prediction, disruption assistance, and Price Freeze™, allowing travelers to secure deals and manage changes conveniently. Its mobile app, available on iOS and Android, provides a streamlined experience, and the company serves millions of travelers worldwide with VIP support and various affiliate programs for partners.

201 - 500 employees

💰 $96M Venture Round on 2022-11

📋 Description

• Own the Data Architecture: Design, build, and maintain scalable and reliable ETL/ELT pipelines to process high-volume advertising data.
• Build the Foundation: Develop and manage our analytical data warehouse, establishing it as the single source of truth for all reporting and analytics.
• Enable Insights: Create clean, reliable, and performant data models that power our advertiser reporting dashboards, internal analytics, and billing systems.
• Ensure Data Integrity: Implement robust data quality checks, monitoring, and alerting to ensure the accuracy and trustworthiness of our data.
• Collaborate and Empower: Work closely with the engineering and product teams to define data requirements and deliver the necessary data infrastructure to support new ad products and features.
• Prepare for the Future: Build the foundational data systems that will enable future ML-driven optimizations for audience targeting and performance prediction.

🎯 Requirements

• 4+ years of data engineering experience, with a demonstrated track record of building and maintaining data infrastructure at scale.
• Expert-level proficiency in SQL and a programming language like Python or Scala for data processing.
• Hands-on experience with cloud-based data warehousing solutions (e.g., BigQuery, Snowflake, Redshift).
• Proven success building and operationalizing data pipelines using orchestration tools (e.g., SQLMesh, Airflow, dbt).
• Experience with real-time data streaming technologies (e.g., Google Pub/Sub, Kafka, Kinesis).
• Strong understanding of data modeling concepts and experience designing schemas for analytical workloads.
• Experience with ad tech, retail media, or large-scale data systems is strongly preferred.
• Excellent communication skills and an ability to collaborate effectively with both technical and business stakeholders.
• A strong sense of ownership and the ability to operate with a high degree of autonomy in a fast-paced, entrepreneurial culture.

🏖️ Benefits

• Well-funded and proven startup with large ambitions, competitive salary, and the upside of pre-IPO equity packages.
• Unlimited PTO.
• Carrot Cash travel stipend.
• Access to co-working space on demand through FlexDesk, plus a work-from-home stipend.
• Please ask us about our very generous parental leave, well above industry standards!
• Entrepreneurial culture where pushing limits and taking risks is everyday business.
• Open communication with management and company leadership.
• Small, dynamic teams = massive impact.
• 100% employer-paid Medical, Dental, and Vision coverage for employees.
• Access to Disability & Life insurance.
• Health Reimbursement Account (HRA).
• DCA/FSA and access to 401k plan.


Similar Jobs

November 20

Senior Data Engineer developing advanced analytics programs within Drata's data platform team. Collaborating on large-scale data pipeline optimization and system performance.

🇺🇸 United States – Remote

💵 $139.6k - $215.5k / year

💰 $100M Series B on 2021-11

⏰ Full Time

🟠 Senior

🚰 Data Engineer

Skills: Amazon Redshift, AWS, BigQuery, ETL, Java, Kafka, NoSQL, Python, Spark, SQL

November 20

Lead Anaplan & Data Architect driving financial planning and analytics strategy at Lime. Collaborate across teams to optimize data strategy and support Finance with insights.

Skills: Python, SQL, Tableau

November 20

Public Health Data Architect at BME Strategies leading technical design and architecture for statewide data modernization initiative. Collaborating across business, epidemiology, and IT domains.

Skills: AWS, Azure, Cloud, ETL, Informatica, Maven, SQL

November 20

Senior Data Engineer joining Agility Robotics to build data infrastructure for humanoid robots. Collaborating on data curation, ETL pipelines, and cross-team data products in a remote setup.

Skills: Airflow, Apache, AWS, Cloud, ETL, Java, Python, Scala, Spark

November 20

Sidekick Health

201 - 500 employees

Senior Data Engineer focusing on building scalable data pipelines and ensuring compliance at Sidekick Health. Working remotely from or near Minneapolis, collaborating with a global team primarily in Europe.

Skills: AWS, BigQuery, Cloud, ETL, Kubernetes, NoSQL, Python, SQL, Terraform

Built by Lior Neu-ner. I'd love to hear your feedback — Get in touch via DM or support@remoterocketship.com