
September 22
🗣️🇩🇪 German Required

We help our customers digitize their businesses in the fields of data platforms and artificial intelligence.
11 - 50 employees
• Design end-to-end architectures for data platforms, establish self-service platforms, and implement governance and cost-optimization standards
• Select Azure technologies, conduct code reviews, and coach the team
• Develop end-to-end data solutions on Azure, from source-system connection to processing and provisioning of data for BI/AI applications
• Select and integrate appropriate technologies (e.g. streaming with Kafka/Event Hubs, storage in Azure Data Lake/Azure SQL, processing with Spark)
• Implement best practices for coding, testing, and monitoring data pipelines to ensure reliability and scalability
• Establish automated CI/CD workflows and oversee operations (monitoring, alerting)
• Work closely with customers and business stakeholders to gather requirements and translate them into technical solutions
• Participate in pre-sales and consulting activities
• 5+ years of experience in data engineering / data platform development, including extensive hands-on practice on Microsoft Azure
• Knowledge of data architectures, distributed data-processing systems, and cloud services (Azure Data Factory, Microsoft Fabric, Databricks, Event Hubs, etc.)
• Experience in data security and governance: familiar with data-protection requirements and able to implement access authorizations, encryption, and compliance
• Proven ability to achieve performance and cost optimization with large data volumes
• Proficient in DevOps methods: continuous delivery of data pipelines, infrastructure-as-code (e.g. Terraform/Bicep), and containerized deployments
• Strategic thinking and consulting: ability to translate business requirements into data architectures and advise clients
• Leadership skills to technically lead a team, with clear communication toward customers (including non-technical contacts)
• Very good knowledge of German and English
Apply Now
September 13
Data Engineer building cargo-tracking models and back-end pipelines for Kpler's maritime trade intelligence.
AWS
Cloud
Docker
ElasticSearch
Flask
Kafka
Kubernetes
Python
Scala
Spark
SQL
Terraform
August 28
Data Engineer – AI, building scalable data backends and operationalising models into production at smartclip
🇩🇪 Germany – Remote
💰 Series A on 2008-12
⏰ Full Time
🟡 Mid-level
🟠 Senior
💰 Data Engineer
Apache
AWS
Docker
ETL
Google Cloud Platform
Grafana
Hadoop
Java
JavaScript
Jenkins
Kubernetes
Linux
Node.js
Open Source
Prometheus
Python
React
Scala
Shell Scripting
Spark
SQL
TypeScript
August 20
Data Engineer collaborates with Operations and Data Science to evolve data models and pipelines. Fully remote within Germany, with Würzburg or Berlin offices.
🗣️🇩🇪 German Required
ETL
Pandas
Python
August 14
Collaborate with Data Engineering and Data Science to enhance data models and pipelines.
Enable AI-powered sustainability in retail through scalable data infrastructure.
🗣️🇩🇪 German Required
ETL
Pandas
Python
August 8
Join virtual7 as a Data Warehouse ETL developer driving digitalization in the public sector.
🗣️🇩🇪 German Required
ETL
Oracle
SQL