Senior Data Engineer

October 21

Apply Now

Dataroid

SaaS • eCommerce • B2B

Dataroid is a digital analytics and customer engagement platform designed to enhance customer experiences through data-driven insights. It provides a unified customer data platform with automated event collection, unified customer profiles, and built-in data privacy and security. Its digital experience analytics include funnel and user path analysis, as well as performance monitoring and uninstall tracking. The platform supports omnichannel customer engagement with tools for behavioral segmentation, in-app messaging, push notifications, and geotargeting, and it leverages predictive analytics for churn prediction and anomaly detection, helping businesses optimize engagement and retention strategies. Dataroid serves marketing, customer experience, product, and technology teams, offering a comprehensive solution for understanding customer behavior, enhancing digital product experiences, and driving retention and growth.

📋 Description

• Designing and building large-scale, resilient data pipelines
• Owning end-to-end data platform architecture, from SDK data ingestion to real-time analytics APIs
• Building unified customer profile systems that consolidate fragmented user data across web, mobile, and IoT channels in real time
• Writing well-designed, reusable, testable, secure, and scalable high-quality code that powers mission-critical analytics
• Driving architectural decisions on streaming, storage, and processing layers
• Collaborating with cross-functional teams, including product, ML, and analytics, to shape data strategy
• Mentoring engineers and establishing data engineering best practices across the organization
• Ensuring platform reliability and performance to meet enterprise SLAs for systems processing millions of events per second

🎯 Requirements

• BSc/MSc/PhD degree in Computer Science or a related field, or equivalent work experience
• 5+ years of experience in Data Engineering, Data Architecture, or a similar role building production systems at scale
• Proven experience architecting and operating real-time analytics systems that handle large volumes of data, with a demonstrated ability to discuss technical tradeoffs and scalability challenges
• Strong experience with real-time data modeling, ETL/ELT practices, and streaming architectures at enterprise scale
• Expert-level proficiency in one or more Python- or Java-based batch and/or stream processing frameworks, such as Apache Spark, Apache Flink, or Kafka Streams
• Production experience with columnar stores and real-time analytics databases (Druid or ClickHouse strongly preferred)
• Strong experience with relational and non-relational data stores, key-value stores, and search engines (PostgreSQL, ScyllaDB, Redis, Hazelcast, Elasticsearch, etc.)
• Hands-on experience with data workflow tools such as Airflow or dbt
• Deep understanding of storage formats such as Parquet, ORC, and/or Avro
• Strong experience with distributed systems, concurrent programming, and real-time data processing at scale
• Experience with distributed storage systems such as HDFS and/or S3
• Familiarity with data lake and data warehouse solutions, including Hive, Iceberg, Hudi, and/or Delta Lake
• Strong analytical thinking and problem-solving skills, with the ability to debug complex distributed systems
• Strong verbal and written communication skills, with the ability to explain technical decisions to both engineers and business stakeholders

🏖️ Benefits

• Private health insurance
• Company-supported pension plans
• Meal vouchers
• Commute assistance
• Remote work benefits
• A paid day off for your birthday
• Enhanced flexible working hours
• Access to online learning platforms such as Udemy
• Tailored training programs
• Happy hours
• Workshops
• Seasonal celebrations
