Mid Data Engineer - Data Engineer II

July 11

Volkswagen Group

Transport • Automotive • Manufacturing

Volkswagen Group is a globally recognized automotive manufacturer that offers a wide range of vehicles and services through various brands. The company is dedicated to innovation in electric mobility, research and development, and production processes. With positions available around the world across numerous sectors within the automotive industry, Volkswagen Group focuses on providing exceptional career opportunities in a dynamic work environment.

10,000+ employees

📋 Description

• Uphold data quality, accuracy, and consistency across all enterprise data assets, and maintain comprehensive data governance policies and procedures.
• Create and maintain data documentation, including data dictionaries, lineage documentation, and metadata management, following industry best practices.
• Collaborate with business stakeholders to understand data requirements and translate business needs into technical data models and solutions.
• Design, build, and maintain robust, scalable data pipelines and ETL processes using the Data Vault methodology for enterprise data integration.
• Develop and optimize data architectures, ensuring data quality, integrity, and accessibility across all organizational systems.
• Create and maintain data models, translating stakeholder requirements into comprehensive data structures.
• Apply data lineage tracking and documentation to ensure transparency in data flows and transformations throughout the organization.
• Track data usage patterns and support data systems to ensure reliable performance and compliance with established governance policies.

🎯 Requirements

• At least 3 years of proven experience as a Data Engineer or in a similar role, such as Data Steward.
• Data modeling experience covering conceptual, logical, and physical data model design and implementation.
• Expertise in the Data Vault methodology for scalable data warehouse architecture, as well as dimensional modeling techniques.
• Experience with data governance frameworks, including compliance requirements, data quality management, profiling, and monitoring processes.
• Python programming skills (2-3 years minimum) for data processing, automation, and data quality validation scripts.
• Proficiency in SQL and relational databases such as Oracle, PostgreSQL, and MySQL, with advanced query optimization skills.
• Experience with ETL processes and data pipeline orchestration tools such as Airflow or similar workflow management systems.
• Strong business acumen, with a proven ability to translate complex business requirements into technical solutions and to communicate with non-technical stakeholders.
• Knowledge of dbt is a plus.
• Knowledge of SAP logistics is a plus.
• Knowledge of ERwin is a plus.
• Eagerness to learn and adapt to new technologies and methodologies.
• Experience managing code with Git (GitHub).
• Ability to work in a team using Agile methodologies.
• Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
• Good level of English is a must (B2 minimum).

🏖️ Benefits

• Fully remote work, with the option to work from our office when needed.
• Access to professional development tools and free language courses.
• Flexible working hours to accommodate personal and professional needs.
• A competitive holiday package and access to a variety of employee discounts.
