Data Engineer

Job not on LinkedIn

November 21


Ruby Labs

Health • Education • Entertainment

Ruby Labs builds consumer products across the health, education, and entertainment sectors. Its portfolio spans health and wellness, pharmaceuticals, education, and entertainment, and the company is self-funded, which lets it pursue ambitious ideas independently. Ruby Labs serves over 100 million annual users and has sustained growth for 6 years, delivering products adopted by millions.

11 - 50 employees

Founded 2018

📚 Education

📋 Description

• Develop and maintain ETL/ELT data pipelines to ingest, transform, and deliver data into the data warehouse.
• Design and implement monitoring and alerting systems to proactively detect pipeline failures, anomalies, and data quality issues.
• Establish data quality validation checks and anomaly detection mechanisms to ensure accuracy and trust in data (see the sketch after this list).
• Define and maintain data structures, schemas, and partitioning strategies for efficient and scalable data storage.
• Create and maintain comprehensive documentation of data pipelines, workflows, data models, and data lineage.
• Troubleshoot and resolve issues related to data pipelines, performance, and quality.
• Collaborate with stakeholders to understand data requirements and translate them into reliable engineering solutions.
• Contribute to continuous improvement of the data platform’s observability, reliability, and maintainability.
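The responsibilities above center on pipeline reliability and data quality. As a rough illustration of the kind of check they describe, the sketch below implements a null-rate guard and a simple z-score volume anomaly test in plain Python. The sample numbers, thresholds, and function names are hypothetical; a production version would read its inputs from the warehouse via SQL and route failures to an alerting channel rather than the log.

```python
import logging
import statistics

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("data_quality")


def check_null_rate(null_count: int, row_count: int, max_rate: float = 0.01) -> bool:
    """Fail the check if the share of NULL business keys exceeds max_rate."""
    rate = (null_count / row_count) if row_count else 1.0
    ok = rate <= max_rate
    if not ok:
        log.warning("Null rate %.2f%% exceeds allowed %.2f%%", rate * 100, max_rate * 100)
    return ok


def check_volume_anomaly(history: list[int], today: int, z_threshold: float = 3.0) -> bool:
    """Flag today's row count if it deviates from the recent mean by more than
    z_threshold standard deviations (a simple z-score anomaly test)."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0  # avoid division by zero on a flat history
    z = abs(today - mean) / stdev
    ok = z <= z_threshold
    if not ok:
        log.warning("Row count %d deviates %.1f sigma from mean %.0f", today, z, mean)
    return ok


if __name__ == "__main__":
    # Hypothetical daily row counts; in practice these would come from the warehouse.
    history = [10_250, 10_410, 9_980, 10_300, 10_190, 10_345]
    today_rows, today_nulls = 4_120, 12

    checks = {
        "null_rate": check_null_rate(today_nulls, today_rows),
        "volume": check_volume_anomaly(history, today_rows),
    }
    failed = [name for name, ok in checks.items() if not ok]
    if failed:
        # A real pipeline would page or post to a chat channel here, not just log.
        log.error("Data quality checks failed: %s", ", ".join(failed))
```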

🎯 Requirements

• Proficiency in Python for data pipeline development, automation, and tooling.
• Strong SQL skills and experience working with cloud data warehouses (ClickHouse, BigQuery preferred).
• Hands-on experience with DBT (modelling, testing, documentation, and deployment).
• Experience with workflow orchestration tools such as Airflow (see the sketch after this list).
• Familiarity with data quality frameworks (e.g. Great Expectations, DBT tests) and anomaly detection methods.
• Experience building monitoring and alerting systems for data pipelines and data quality.
• Ability to write clear, maintainable, and actionable technical documentation.
• Strong problem-solving skills and attention to detail.
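Several of the tools named above (Airflow, DBT, alerting on failure) are commonly combined in one pattern: an orchestrated daily build followed by automated tests. The sketch below is a minimal example of that pattern, assuming Airflow 2.x and a dbt project available on the worker; the DAG id, schedule, project path, and notification logic are placeholders, not anything specific to this role.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/dbt/analytics"  # hypothetical path to the dbt project


def notify_failure(context):
    """Failure callback: a real deployment would post to Slack or PagerDuty;
    here we only print the failing task for illustration."""
    ti = context["task_instance"]
    print(f"ALERT: task {ti.task_id} in DAG {ti.dag_id} failed")


default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_failure,
}

with DAG(
    dag_id="warehouse_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 3 * * *",  # daily at 03:00 UTC
    catchup=False,
    default_args=default_args,
) as dag:
    # Build models in the warehouse, then run dbt's schema and data tests
    # so quality failures surface before downstream consumers read the data.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
    )

    dbt_run >> dbt_test
```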

🏖️ Benefits

• Remote Work Environment: Embrace the freedom to work from anywhere, anytime, promoting a healthy work-life balance.
• Unlimited PTO: Enjoy unlimited paid time off to recharge and prioritize your well-being, without counting days.
• Paid National Holidays: Celebrate and relax on national holidays with paid time off to unwind and recharge.
• Company-provided MacBook: Experience seamless productivity with top-notch Apple MacBooks provided to all employees who need them.
• Flexible Independent Contractor Agreement: Unlock the benefits of flexibility, autonomy, and entrepreneurial opportunities. Benefit from tax advantages, networking opportunities, reduced employment obligations, and the freedom to work from anywhere.

Apply Now

Similar Jobs

November 21

Wavestone

1001 - 5000

☁️ SaaS

🔒 Cybersecurity

Senior Data Engineer working on Azure & AI at Wavestone, supporting data engineering and analytics for business decision-making. Collaborating with clients and technical teams on cloud solutions.

🇵🇱 Poland – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

🗣️🇩🇪 German Required

🗣️🇵🇱 Polish Required

November 19

Jamf

1001 - 5000

🏢 Enterprise

📚 Education

Data Engineer II at Jamf responsible for building and transforming data infrastructure for business intelligence. Collaborating with analysts and data scientists to ensure data reliability and performance.

🇵🇱 Poland – Remote

💰 $300M Post-IPO Secondary on 2021-09

⏰ Full Time

🟢 Junior

🟡 Mid-level

🚰 Data Engineer

November 19

Infotree Global Solutions

1001 - 5000

🎯 Recruiter

👥 HR Tech

🏢 Enterprise

Data Engineer specializing in Data Mesh development for a technology company optimizing data processing infrastructure across products. Seeking an experienced professional with strong Apache Spark skills.

🇵🇱 Poland – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

November 18

Provectus

501 - 1000

🤖 Artificial Intelligence

☁️ SaaS

Data Engineer at Provectus, focusing on Data Engineering and Machine Learning with a diverse team. Collaborating on data-driven architectures and innovative data platforms.

🇵🇱 Poland – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

November 15

Senior Data Engineer responsible for implementing Microsoft Data Intelligence solutions. Collaborate with clients to analyze large data sets and support business decisions using Azure.

🇵🇱 Poland – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

🗣️🇩🇪 German Required

🗣️🇵🇱 Polish Required
