Data Engineer

Job not on LinkedIn

September 13

Apply Now
Kpler

Energy • Transport • Data Services

Kpler is a leading provider of real-time data and intelligence in the maritime and energy sectors. The company offers products and services designed to help businesses make informed decisions in commodity trading, energy transition, and maritime logistics. With a robust platform that provides detailed insights into physical markets, power markets, and financial markets, Kpler enables traders, analysts, fleet managers, and logistics professionals to optimize their operations and strategies. The company's tools include ship tracking, cargo inventories, and power market analytics, serving over 10,000 organizations worldwide.

201 - 500 employees

Founded 2014

⚡ Energy

🚗 Transport

📋 Description

• Work alongside data engineers, data scientists, and product managers to develop and implement core algorithms and back-end data pipelines based on project requirements and design specifications.

• Add new features and review pipelines and integrations with datastores to ensure the highest cargo tracking data quality for clients and internal users.

• Help to optimise system performance, maintain features, troubleshoot issues, and ensure high availability.

• Use advanced Operations Research and Data Science techniques to process and transform live data and integrate it into core data pipelines and cargo tracking models.

• Manage and ensure the quality of the data provided to clients as part of the Cargo Models team.

• Demonstrate strong analytical and debugging skills with a proactive approach to learning.

🎯 Requirements

• Hands-on experience with Python; experience with Flask or SQLAlchemy is a plus.

• Solid SQL skills for querying and managing relational databases.

• Knowledge of streaming and big data technologies (such as Kafka or Spark).

• Comfortable working with Git, code reviews, and Agile methodologies.

• An understanding of containerisation and orchestration tools (e.g., Docker, Kubernetes).

• Eager to learn new languages and technologies.

• Nice to have: experience with AWS (or another cloud provider), using Terraform.

• Nice to have: experience with Scala or other JVM languages.

• Nice to have: exposure to Elasticsearch.

Similar Jobs

August 28

smartclip

501 - 1000

☁️ SaaS

📱 Media

Data Engineer – AI at smartclip, building scalable data backends and operationalising models into production

🇩🇪 Germany – Remote

💰 Series A on 2008-12

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

August 20

AIPERIA

51 - 200

🤖 Artificial Intelligence

☁️ SaaS

🌾 Agriculture

Data Engineer collaborates with Operations and Data Science to evolve data models and pipelines. Fully remote within Germany, with Würzburg or Berlin offices.

🇩🇪 Germany – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

🗣️🇩🇪 German Required

August 14

AIPERIA

51 - 200

🤖 Artificial Intelligence

☁️ SaaS

🌾 Agriculture

Collaborate with Data Engineering and Data Science to enhance data models and pipelines. Enable AI-powered sustainability in retail through scalable data infrastructure.

🇩🇪 Germany – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

🗣️🇩🇪 German Required

August 8

virtual7 GmbH

51 - 200

🔌 API

🛍️ eCommerce

Join virtual7 as a Data Warehouse ETL developer driving digitalization in the public sector.

🇩🇪 Germany – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

🗣️🇩🇪 German Required

July 30

devop1

201 - 500

🤝 B2B

🎯 Recruiter

☁️ SaaS

Drive data architecture and processing improvements, mentoring teams in cloud automation services.

🇩🇪 Germany – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

Developed by Lior Neu-ner. I'd love to hear your feedback — Get in touch via DM or support@remoterocketship.com