Data Engineer

Job not on LinkedIn

November 19


Ruby Labs

Health • Education • Entertainment

Ruby Labs builds consumer products across the health, education, and entertainment sectors. Its portfolio spans health and wellness, pharmaceuticals, education, and entertainment, and its self-funded model lets the company pursue ambitious ideas independently. Ruby Labs serves over 100 million annual users and has delivered six years of sustained growth, with products adopted by millions.

11 - 50 employees

Founded 2018

📚 Education

📋 Description

• Develop and maintain ETL/ELT data pipelines to ingest, transform, and deliver data into the data warehouse (see the sketch after this list).
• Design and implement monitoring and alerting systems to proactively detect pipeline failures, anomalies, and data quality issues.
• Establish data quality validation checks and anomaly detection mechanisms to ensure accuracy and trust in data.
• Define and maintain data structures, schemas, and partitioning strategies for efficient and scalable data storage.
• Create and maintain comprehensive documentation of data pipelines, workflows, data models, and data lineage.
• Troubleshoot and resolve issues related to data pipelines, performance, and quality.
• Collaborate with stakeholders to understand data requirements and translate them into reliable engineering solutions.
• Contribute to continuous improvement of the data platform’s observability, reliability, and maintainability.
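
To make the day-to-day concrete, here is a minimal sketch of such a pipeline as an Airflow 2.x DAG with an on-failure alert callback and a data quality gate. The DAG id, task bodies, and alerting target are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch, assuming Airflow 2.x. All names (DAG id, callables, alert target)
# are hypothetical placeholders for illustration only.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    """On-failure callback: surface the failed task so it can be alerted on."""
    failed_task = context["task_instance"].task_id
    # Replace the print with Slack/PagerDuty/email integration in a real setup.
    print(f"ALERT: task '{failed_task}' failed in run {context['run_id']}")


def extract(**_):
    """Pull raw data from the source system (placeholder)."""


def transform(**_):
    """Clean and reshape the raw data (placeholder)."""


def load(**_):
    """Write the transformed data into the warehouse (placeholder)."""


def check_quality(**_):
    """Fail the run if a basic data quality expectation is violated."""
    row_count = 1  # placeholder; query the loaded partition in a real pipeline
    if row_count == 0:
        raise ValueError("Data quality check failed: loaded partition is empty")


default_args = {
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_on_failure,
}

with DAG(
    dag_id="events_daily_etl",      # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)
    quality_task = PythonOperator(task_id="check_quality", python_callable=check_quality)

    extract_task >> transform_task >> load_task >> quality_task
```

Failing the quality task triggers the same alert path as an infrastructure failure, which keeps data issues as visible as pipeline outages.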

🎯 Requirements

• Proficiency in Python for data pipeline development, automation, and tooling.
• Strong SQL skills and experience working with cloud data warehouses (ClickHouse, BigQuery preferred).
• Hands-on experience with DBT (modelling, testing, documentation, and deployment).
• Experience with workflow orchestration tools such as Airflow.
• Familiarity with data quality frameworks (e.g. Great Expectations, DBT tests) and anomaly detection methods (see the sketch after this list).
• Experience building monitoring and alerting systems for data pipelines and data quality.
• Ability to write clear, maintainable, and actionable technical documentation.
• Strong problem-solving skills and attention to detail.
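
On the anomaly detection point, a framework-free illustration is below: compare today's row count for a table against its recent history and flag large deviations. The sample values and the three-sigma threshold are illustrative assumptions; in practice this kind of check often lives in DBT tests or a framework such as Great Expectations, wired into the alerting described above.

```python
# Minimal sketch of a statistical anomaly check on daily row counts.
# The sample values and the 3-sigma threshold are illustrative assumptions.
from statistics import mean, stdev


def is_anomalous(history: list[int], today: int, z_threshold: float = 3.0) -> bool:
    """Return True if `today` deviates from the historical mean by more than
    `z_threshold` standard deviations."""
    if len(history) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > z_threshold


if __name__ == "__main__":
    daily_row_counts = [10_120, 9_980, 10_340, 10_050, 10_210]  # recent history
    print(is_anomalous(daily_row_counts, 10_180))  # False: within normal variation
    print(is_anomalous(daily_row_counts, 2_500))   # True: likely a broken ingestion
```

The same pattern generalizes to freshness, null rates, or duplicate counts, with an alert raised whenever a check fails.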

🏖️ Benefits

• Remote Work Environment: Embrace the freedom to work from anywhere, anytime, promoting a healthy work-life balance.
• Unlimited PTO: Enjoy unlimited paid time off to recharge and prioritize your well-being, without counting days.
• Paid National Holidays: Celebrate and relax on national holidays with paid time off to unwind and recharge.
• Company-provided MacBook: Experience seamless productivity with top-notch Apple MacBooks provided to all employees who need them.
• Flexible Independent Contractor Agreement: Unlock the benefits of flexibility, autonomy, and entrepreneurial opportunities. Benefit from tax advantages, networking opportunities, reduced employment obligations, and the freedom to work from anywhere.


Similar Jobs

November 19

Київстар

1001 - 5000

📡 Telecommunications

Data Warehouse Administrator managing the Data Warehouse (DWH) infrastructure for Kyivstar.Tech. Optimizing performance, ensuring security, and collaborating with ETL developers for reliable operation.

🇺🇦 Ukraine – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

ETL

Greenplum

Informatica

Linux

MS SQL Server

Oracle

Postgres

Python

Shell Scripting

SQL

TCP/IP

Unix

November 4

Intetics

501 - 1000

🤖 Artificial Intelligence

🏢 Enterprise

Data Engineer role at Intetics Inc. focusing on designing Apache Airflow pipelines and optimizing large-scale ETL workflows. Collaborating with cross-functional teams on data processes and solutions.

🇺🇦 Ukraine – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

October 22

Sigma Software Group

1001 - 5000

🎮 Gaming

📡 Telecommunications

Data Engineer leading a small team to modernize a data platform into a cloud-based analytics project at Sigma Software.

🇺🇦 Ukraine – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

October 12

Valtech

5001 - 10000

🤝 B2B

☁️ SaaS

Lead Data Engineer at Valtech designing and optimizing large-scale data solutions. Collaborating across technologies like Databricks, AWS, and GCP.

🇺🇦 Ukraine – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

October 12

Valtech

5001 - 10000

🤝 B2B

☁️ SaaS

Senior Data Engineer designing and optimizing data solutions for analytics and AI at Valtech. Collaborating with teams on batch and streaming data pipelines in a dynamic environment.

🇺🇦 Ukraine – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer
