Data Engineer

November 12


Ciklum

Artificial Intelligence • B2B • Enterprise

Ciklum is a global digital engineering and AI-enabled product and platform services company that helps enterprises design, build, and scale AI-infused software, cloud, data, and automation solutions. It combines UX and product design with engineering, DevOps, data engineering, responsible AI, and edge/IoT capabilities to move pilots into production and deliver enterprise-ready outcomes across industries such as banking, retail, healthcare, hi-tech, automotive, and travel. Ciklum emphasizes platform-agnostic, scalable solutions, covering AI incubators, conversational AI, agentic automation, cloud and edge services, XR/AR/VR, and digital assurance, all focused on transforming workflows and customer experiences for B2B enterprise clients.

📋 Description

• Design, build, and operate our core AIOps and analytics platform on Kubernetes (starting with EKS)
• Implement and optimize large-scale data ingestion pipelines using AWS EMR and Spark
• Own and operate our self-managed OpenSearch cluster on Kubernetes (using the K8s operator)
• Design and implement robust data lifecycle management and tiering strategies
• Get hands-on implementing and integrating AI, machine learning, and generative AI into our platform
• Evolve the platform to support multiple cloud providers beyond AWS
• Continuously improve our data processing and models to deliver new and deeper insights to our customers

🎯 Requirements

• Significant (e.g., 7+ years) hands-on data engineering experience, with a proven track record in a Staff, Principal, or equivalent senior implementer role
• Deep, hands-on experience building, deploying, and managing data-intensive applications on Kubernetes (EKS preferred)
• Proven experience running and scaling OpenSearch (or Elasticsearch) in a large-scale, self-managed production environment; experience with the K8s operator is a major plus
• Proven experience building, scaling, and optimizing complex, large-scale data pipelines in production
• Demonstrable experience optimizing complex data workflows and implementing data lifecycle/tiering policies to manage and reduce storage costs
• Strong knowledge of the AWS ecosystem (EMR, S3, EKS, etc.)
• Proficiency in a relevant programming language (e.g., Python, Scala, or Java)
• Excellent communication and collaboration skills, with a passion for working as part of a high-performing team

🏖️ Benefits

• Regular salary reviews based on performance
• Corporate events, including webinars, offline parties, and meetups
• Internal Mobility Program
• Tailored education path, including full access to Udemy and company-paid certifications
• 25 paid days off: 20 business days of vacation per calendar year plus 5 undocumented sick leave days
• Additional health insurance
• 100% company-covered Multisport card, with family discounts available

