
Transport • B2C • B2B
SIXT is a global mobility company offering premium car rental, long-term rentals, car subscriptions, chauffeur services, and car-sharing through an integrated app and worldwide station network. With over 2,000 stations in 105+ countries and a diverse fleet of 222,000+ vehicles including luxury and specialty models, SIXT serves leisure and corporate customers with airport and city pick-up/return, business fleet solutions, and digital booking and loyalty features.
October 10

• You develop end-to-end automation of data pipelines to make ingestion, transformation, and distribution of data more efficient and self-service for our users
• You explore and learn the latest AWS and big data technologies to uncover hidden use cases, enable new capabilities, and build new integrations for our SIXT Data Shop product
• You work with Data Engineers, BI Analysts, and Data Scientists to find the best solutions for different analytical use cases (dashboarding, ad hoc analytics, data-as-a-product, machine learning)
• You contribute to the Data Platform vision and execution with experience, new ideas, and your own curiosity
• You demonstrate strong work ethics and integrity when handling sensitive data, protecting our end customers
• You hold a bachelor's or master's degree in computer science or a related field (preferred)
• You have experience with combined Data Lake and Data Warehouse architectures and with automating ELT flows from data sources to analytical dashboards
• You are experienced with analytical cloud data warehouses (Redshift, Snowflake), know how to create transformed data models (dbt), and can automate interdependent jobs (Apache Airflow)
• You are experienced in working with AWS services via both CLI and Console (S3, EC2, EMR, Lambda, Kinesis, IAM...) and with infrastructure as code (Terraform, CloudFormation)
• You follow engineering best practices across the development lifecycle, including agile methodologies, code reviews, source management (GitHub), deployment processes (Jenkins), testing, and operations, and you understand data management fundamentals, distributed systems, and data storage/compute principles
• Python and SQL are a must
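The requirements above center on automating interdependent jobs, which is what an Airflow DAG or a dbt model graph does at scale: each task runs only after all of its upstream tasks. As an illustration only (the task names here are invented, not from the posting), a minimal sketch of dependency-ordered execution using Python's standard library:

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Hypothetical ELT steps mapped to the sets of tasks they depend on.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "stage_raw": {"extract_orders", "extract_customers"},
    "transform_marts": {"stage_raw"},
    "refresh_dashboard": {"transform_marts"},
}

def run_pipeline(dag):
    """Run tasks in a dependency-respecting order; returns the order used."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        # Placeholder for a real job invocation (a dbt run, a Lambda, a query).
        print(f"running {task}")
    return order

executed = run_pipeline(dag)
```

Orchestrators add retries, scheduling, and parallelism on top of this ordering, but the core contract is the same topological sort shown here.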
• Enjoy 28 days of vacation, an additional day off for your birthday, and 1 volunteer day per year
• Benefit from a hybrid working model, flexible working hours, and no dress code
• Access discounts on SIXT rent, share, ride, and SIXT+, along with partner discounts
• Participate in training programs, external conferences, and internal dev & tech talks designed for your personal growth and development
• Private health insurance to support your well-being
• Enjoy the Coverflex advantage system to enhance your employee experience
October 3
Data Engineer focusing on improving data pipelines for Veeva Link, a life sciences cloud solution. Join a mission-driven organization supporting rapid therapy development.
Apache
AWS
Cloud
Google Cloud Platform
Java
PySpark
Python
Spark
September 11
Senior Data Engineer building AI-focused data models and pipelines at Wellhub, a corporate wellness platform. Process multimodal user data to improve recommendations and product quality.
Airflow
Cloud
Kafka
August 28
1,001–5,000 employees
Senior Data Engineer leading a Lift & Shift migration from Hadoop/Hive to Snowflake at Keyrus, developing Dataiku and Spark pipelines.
🗣️🇫🇷 French Required
Apache
AWS
Azure
Cloud
ETL
Google Cloud Platform
Hadoop
Informatica
Python
Scala
Spark
SQL
July 11
MoonPay is seeking a Senior Data Engineer to shape its data platform. Lead complex projects and mentor teams in a remote role.
Airflow
AWS
Azure
Cloud
Google Cloud Platform
Kafka
Kubernetes
Python
SQL
Terraform
Web3
June 20
Join Intermedia as a Data Engineer to modernize its data platform using Snowflake expertise.
AWS
Azure
Cloud
ETL
Google Cloud Platform
MS SQL Server
SQL