
Transport • Automotive • Manufacturing
Volkswagen Group is a globally recognized automotive manufacturer that offers a wide range of vehicles and services through various brands. The company is dedicated to innovation in electric mobility, research and development, and production processes. With positions available around the world across numerous sectors within the automotive industry, Volkswagen Group focuses on providing exceptional career opportunities in a dynamic work environment.
10,000+ employees
🚗 Transport
June 17
Airflow
Amazon Redshift
BigQuery
Docker
ETL
Grafana
Kubernetes
MySQL
NoSQL
OpenShift
Oracle
Postgres
Prometheus
Python
Redis
Spark
SQL

• Design, build, and maintain robust, scalable data pipelines and ETL processes to support data integration and transformation.
• Develop and optimize data architectures, ensuring data quality, integrity, and accessibility.
• Collaborate with data scientists and analysts to understand data needs and provide efficient data solutions.
• Implement data governance and security measures to protect sensitive information.
• Monitor and troubleshoot data systems to ensure optimal performance and reliability.
• Work with large datasets, employing advanced data processing techniques and technologies.
• Stay up to date with the latest industry trends and technologies in data engineering, and recommend improvements.
• Advanced knowledge of Python (at least 4-5 years of experience).
• Proficiency with NoSQL and analytical databases such as Druid, Redis, Redshift, or BigQuery, plus strong SQL skills with relational databases such as Oracle, PostgreSQL, and MySQL.
• Proficiency with ETL, high-performance data pipelines, and orchestration engines such as Argo Workflows or Airflow.
• Experience with Spark and other data engineering tools such as Trino.
• Proficiency with Docker, OpenShift, Kubernetes, and Helm.
• Experience setting up, maintaining, and monitoring CI/CD pipelines (Bamboo) and applications (Grafana, Prometheus, Traceroute).
• Knowledge of testing strategies and tooling, such as TDD, unit testing, integration testing, and PyTest.
• Eagerness to learn and adapt to new technologies and methodologies.
• Strong communication skills in English and the ability to work in a team using Agile methodologies.
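As a small illustration of the testing practices listed above, a PyTest-style unit test for a data transformation can look like the following sketch (the function and its behavior are hypothetical examples, not part of the role description):

```python
def normalize_amount(amount_cents: int) -> float:
    """Convert a raw amount in integer cents to euros.
    Negative inputs are rejected as invalid data."""
    if amount_cents < 0:
        raise ValueError("amount must be non-negative")
    return amount_cents / 100.0

# PyTest discovers functions prefixed with test_ automatically;
# plain asserts serve as the test conditions.
def test_normalize_amount():
    assert normalize_amount(250) == 2.5
    assert normalize_amount(0) == 0.0
```

Under TDD, a test like this would be written first, with the transformation implemented afterward to make it pass.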
• Fully remote work, with the option to work from our office when needed.
• Access to professional development tools and free language courses.
• Flexible working hours to accommodate personal and professional needs.
• A competitive holiday package and access to a variety of employee discounts.