Data Engineer – AI

August 28


smartclip

SaaS • Media • Technology

smartclip is a European company specializing in ad platforms and technology solutions for broadcasters and publishers. Headquartered in Berlin, it offers a range of technology services, including smartx, its main ad server and SSP platform. The company focuses on innovative, efficient advertising solutions and integrates advanced analytics to measure ad performance and improve user engagement. smartclip also partners with technology vendors and hires across areas such as software engineering, API development, and data engineering, with remote work options. It aims to lead in ad technology by continuously evolving and adapting to new market demands and privacy regulations.

501 - 1000 employees

☁️ SaaS

📱 Media

💰 Series A (December 2008)

📋 Description

• Build the Backbone of AI That Matters
• Own the architecture and operations of AI applications, from raw data to production-grade services
• Work side by side with Data Scientists to make models actually usable, and drive the technical backbone of products that run at scale
• Design, build & run: Architect scalable data backends to power AI-driven products
• Bridge the gap: Turn prototypes into production alongside Data Scientists
• Automate & optimize: Ensure stability and performance of AI APIs and make testing a non-event
• Explore the new: Evaluate bleeding-edge open-source storage, compute, and orchestration tech
• Tech-first environment: Work with TypeScript, Node.js, React, Python, SQL, Scala, Java, Docker, Kubernetes, AWS, GCP

🎯 Requirements

• Deep knowledge of software engineering, including testing and design patterns
• Experience with data pipelines, storage, and processing techniques
• A get-code-into-prod mentality: you care about robustness and performance
• You thrive in agile, cross-functional teams where people ship fast and learn faster
• Experience with Python, SQL, Git
• Bonus skills (nice to have): Apache Hadoop, Spark; Docker, Kubernetes; Grafana, Prometheus, Graylog; Jenkins; Java, Scala; shell scripting

🏖️ Benefits

• Top-tier hardware (Mac/Linux/whatever you need)
• Paid access to Coursera, Udacity, conferences, hackathons & coaching
• "Smart Fridays" – our 4-day workweek to protect your flow
• Flexible hours & remote work – we trust you to do your thing
• JobRad + Urban Sports Club deals + free RTL+ subscription
• Deutschlandticket subsidy, awesome team events, and more

Apply Now

Similar Jobs

August 20

Data Engineer collaborates with Operations and Data Science to evolve data models and pipelines. Fully remote within Germany, with Würzburg or Berlin offices.

🗣️🇩🇪 German Required • ETL • Pandas • Python

August 14

Collaborate with Data Engineering and Data Science to enhance data models and pipelines. Enable AI-powered sustainability in retail through scalable data infrastructure.

🗣️🇩🇪 German Required • ETL • Pandas • Python

August 8

Join virtual7 as a Data Warehouse ETL developer driving digitalization in the public sector.

🗣️🇩🇪 German Required • ETL • Oracle • SQL

July 30

Drive data architecture and processing improvements, mentoring teams in cloud automation services.

Airflow • Apache • Cloud • ETL • Java • Python • Scala • Spark • SQL

July 12

Join smartclip as a Data Engineer to architect scalable data systems for AI-driven products.

Apache • AWS • Docker • ETL • Google Cloud Platform • Grafana • Hadoop • Java • JavaScript • Jenkins • Kubernetes • Linux • Node.js • Open Source • Prometheus • Python • React • Scala • Shell Scripting • Spark • SQL • TypeScript
