Data Engineer

Job not on LinkedIn

November 21

Apply Now

Ruby Labs

Health • Education • Entertainment

Ruby Labs builds consumer products across the health, education, and entertainment sectors. Its portfolio spans health and wellness, pharmaceuticals, education, and entertainment, and its self-funded model lets it pursue ambitious ideas independently. The company serves over 100 million annual users and has sustained growth over six years, delivering products adopted by millions.

11 - 50 employees

Founded 2018

📚 Education

📋 Description

• Develop and maintain ETL/ELT data pipelines to ingest, transform, and deliver data into the data warehouse.

• Design and implement monitoring and alerting systems to proactively detect pipeline failures, anomalies, and data quality issues.

• Establish data quality validation checks and anomaly detection mechanisms to ensure accuracy and trust in data.

• Define and maintain data structures, schemas, and partitioning strategies for efficient and scalable data storage.

• Create and maintain comprehensive documentation of data pipelines, workflows, data models, and data lineage.

• Troubleshoot and resolve issues related to data pipelines, performance, and quality.

• Collaborate with stakeholders to understand data requirements and translate them into reliable engineering solutions.

• Contribute to continuous improvement of the data platform’s observability, reliability, and maintainability.
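The monitoring and data-quality duties above can be sketched with a simple statistical check. This is a minimal illustration, not the company's actual method; the metric (daily row counts) and the threshold are assumptions for the example.

```python
from statistics import mean, stdev

def is_anomalous(history, today, threshold=3.0):
    """Return True when `today` deviates more than `threshold`
    standard deviations from the historical mean.

    Hypothetical data-quality check: a real pipeline would pull
    `history` (e.g. daily row counts per table) from the warehouse
    and route failures to an alerting channel.
    """
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        # Flat baseline: any change at all is suspicious.
        return today != mu
    return abs(today - mu) / sigma > threshold

# A sudden drop in row count is flagged; normal variation is not.
daily_counts = [1000, 1020, 980, 1010, 990]
print(is_anomalous(daily_counts, 100))   # → True
print(is_anomalous(daily_counts, 995))   # → False
```

In practice a robust statistic (median and MAD) is often preferred over mean/stdev, since a single bad day can inflate the standard deviation and mask later anomalies.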

🎯 Requirements

• Proficiency in Python for data pipeline development, automation, and tooling.

• Strong SQL skills and experience working with cloud data warehouses (ClickHouse, BigQuery preferred).

• Hands-on experience with DBT (modelling, testing, documentation, and deployment).

• Experience with workflow orchestration tools such as Airflow.

• Familiarity with data quality frameworks (e.g. Great Expectations, DBT tests) and anomaly detection methods.

• Experience building monitoring and alerting systems for data pipelines and data quality.

• Ability to write clear, maintainable, and actionable technical documentation.

• Strong problem-solving skills and attention to detail.
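For the DBT tests the requirements mention, tests are declared in a model's schema YAML. The fragment below is a generic sketch of dbt's built-in `unique` and `not_null` tests; the `orders` model and its columns are hypothetical, not taken from the posting.

```yaml
version: 2

models:
  - name: orders            # hypothetical model name
    columns:
      - name: order_id
        tests:
          - unique
          - not_null
      - name: created_at
        tests:
          - not_null
```

Running `dbt test` then compiles each declared test into a SQL query that fails if it returns any rows.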

🏖️ Benefits

• Remote Work Environment: Embrace the freedom to work from anywhere, anytime, promoting a healthy work-life balance.

• Unlimited PTO: Enjoy unlimited paid time off to recharge and prioritize your well-being, without counting days.

• Paid National Holidays: Celebrate and relax on national holidays with paid time off to unwind and recharge.

• Company-provided MacBook: Experience seamless productivity with top-notch Apple MacBooks provided to all employees who need them.

• Flexible Independent Contractor Agreement: Unlock the benefits of flexibility, autonomy, and entrepreneurial opportunities. Benefit from tax advantages, networking opportunities, reduced employment obligations, and the freedom to work from anywhere.


Similar Jobs

September 12

Xenon Seven

11 - 50

🤖 Artificial Intelligence

🏢 Enterprise

Senior Data Engineer at Xenon7 designing and deploying time-series and neural ML models for inventory management using Python, Databricks, Snowflake, and AWS.

🇷🇸 Serbia – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

Developed by Lior Neu-ner. I'd love to hear your feedback — Get in touch via DM or support@remoterocketship.com