Data Engineer

Job not on LinkedIn

September 22

🗣️🇩🇪 German Required


paiqo

We help our customers digitize their businesses in the fields of data platforms and artificial intelligence.

11 - 50 employees

📋 Description

• Build and operate modern data platforms with a focus on data engineering
• Develop scalable data pipelines and ETL/ELT processes on Microsoft Azure
• Develop stable batch and streaming pipelines with Azure Data Factory and Databricks/Spark (see the sketch after this list)
• Ensure reliability using best practices such as Delta Lake, automated workflows, and VNet network isolation
• Take responsibility for sub-projects in data integration, e.g. setting up data paths from operational systems to a data lakehouse
• Implement and optimize ETL/ELT routes and ensure data quality
• Use Azure data services for storage and processing (Data Lake, Azure SQL, Databricks, Microsoft Fabric)
• Participate in setting up CI/CD pipelines (Azure DevOps or GitHub) for automated deployments
• Collaborate closely with Data Scientists and Analytics teams to provide data for analytics and ML
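To give a flavor of the batch pipeline work described above, here is a minimal sketch of landing raw files into a Delta table with PySpark, assuming a Databricks runtime where Delta Lake is available. The storage account, container paths, and column names are hypothetical placeholders, not details from the posting.

```python
# Minimal batch pipeline sketch: land raw CSV files into a Delta table on Azure Data Lake.
# All paths, storage account names, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, a session already exists

raw_path = "abfss://landing@examplestorage.dfs.core.windows.net/orders/"          # hypothetical
delta_path = "abfss://curated@examplestorage.dfs.core.windows.net/orders_delta/"  # hypothetical

orders = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(raw_path)
    .withColumn("ingested_at", F.current_timestamp())  # simple ingestion timestamp for lineage
)

# Write as Delta so downstream jobs get ACID guarantees and time travel.
(
    orders.write
    .format("delta")
    .mode("append")
    .save(delta_path)
)
```

In practice a job like this would typically be orchestrated by Azure Data Factory or a Databricks workflow and deployed through the CI/CD pipelines mentioned above.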

🎯 Requirements

• 2-4 years of experience in data engineering or data platform development
• Solid knowledge of SQL and programming (Python or Scala)
• Experience with Azure Data Services (e.g. Azure Data Factory, Azure Databricks, Synapse)
• Familiarity with data modeling (e.g. star schema, Kimball), as illustrated in the sketch after this list
• Experience in data platform monitoring and performance optimization
• Experience with version control (Git) and DevOps practices (continuous integration, infrastructure as code)
• Communication and collaboration: working with data scientists, analysts, and clients, and translating technical concepts for non-technical audiences
• Problem solving and analytical thinking: optimizing data flows, identifying bottlenecks, and finding creative solutions
• Language skills: fluent German, good English
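As a rough illustration of the Kimball-style modeling mentioned above, the sketch below derives a sales fact table by joining a curated source to customer and date dimensions with PySpark. All table and column names are invented for illustration and assume the dimensions already carry surrogate keys.

```python
# Kimball-style sketch: build a fact table keyed by surrogate dimension keys.
# Table and column names are illustrative only, not from the posting.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

orders   = spark.table("curated.orders")        # hypothetical curated source with order_date, customer_id
dim_cust = spark.table("dwh.dim_customer")      # hypothetical dimension: customer_id -> customer_key
dim_date = spark.table("dwh.dim_date")          # hypothetical dimension: date -> date_key

fact_sales = (
    orders
    .join(dim_cust, "customer_id")                              # resolve customer surrogate key
    .join(dim_date, orders["order_date"] == dim_date["date"])   # resolve date surrogate key
    .select("date_key", "customer_key", "order_id", "quantity", "net_amount")
)

fact_sales.write.format("delta").mode("overwrite").saveAsTable("dwh.fact_sales")
```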

Apply Now

Similar Jobs

September 13

Kpler

201 - 500

⚡ Energy

🚗 Transport

Data Engineer building cargo-tracking models and back-end pipelines for Kpler's maritime trade intelligence.

🇩🇪 Germany – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

August 28

smartclip

501 - 1000

☁️ SaaS

📱 Media

Data Engineer – AI building scalable data backends and operationalising models into production at smartclip

🇩🇪 Germany – Remote

💰 Series A on 2008-12

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

August 20

AIPERIA

51 - 200

🤖 Artificial Intelligence

☁️ SaaS

🌾 Agriculture

Data Engineer collaborates with Operations and Data Science to evolve data models and pipelines. Fully remote within Germany, with Würzburg or Berlin offices.

🇩🇪 Germany – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

🗣️🇩🇪 German Required

August 14

AIPERIA

51 - 200

🤖 Artificial Intelligence

☁️ SaaS

🌾 Agriculture

Collaborate with Data Engineering and Data Science to enhance data models and pipelines. Enable AI-powered sustainability in retail through scalable data infrastructure.

🇩🇪 Germany – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

🗣️🇩🇪 German Required

August 8

virtual7 GmbH

51 - 200

🔌 API

🛍️ eCommerce

Join virtual7 as a Data Warehouse ETL developer driving digitalization in the public sector.

🇩🇪 Germany – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

🗣️🇩🇪 German Required
