Data Engineer - Trojan

Job not on LinkedIn

August 16

Apply Now

Veralto

B2B • Energy • Science

Veralto is a global enterprise of 13 operating companies spanning more than 300 locations worldwide. With a workforce of 16,000 associates, Veralto focuses on impactful work in areas crucial to everyday life, such as water, food, and medicine. The company's Water Quality division manages, treats, purifies, and protects water on a global scale, while the Product Quality & Innovation division ensures the safety and authenticity of essential goods in the global supply chain. Committed to fostering a diverse and inclusive workplace, Veralto invests in its employees' growth through hands-on learning and career development opportunities, supported by a global network and the resources of an S&P 500 company.

📋 Description

• Design, develop, and maintain scalable ELT/ETL pipelines using Matillion and Snowflake to support diverse data integration and transformation needs across the company.
• Architect end-to-end data workflows that ensure high performance, reliability, and data integrity for both batch and near real-time use cases.
• Collaborate with cross-functional teams including Data Analysts, DevOps, and business stakeholders to gather requirements and deliver data solutions that drive value.
• Define and implement best practices for data modeling, metadata management, data lineage, and governance, utilizing the features of Snowflake and Matillion.
• Optimize data storage, retrieval, and computation to ensure efficient processing and cost control within our cloud infrastructure.
• Monitor, troubleshoot, and resolve issues related to data pipelines, performance bottlenecks, and data quality challenges.

🎯 Requirements

• 3+ years of continuous experience working with Matillion ELT/ETL for cloud data warehouses, including designing complex orchestration jobs, transformation components, and API integrations.
• Advanced knowledge of Snowflake, including schema design, security, performance tuning, streams and tasks, and cost optimization strategies.
• Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field, or equivalent work experience.
• 5+ years of professional experience in data engineering, with at least 2 years in a senior role.
• Proven experience in architecting large-scale, distributed data systems and implementing data lakes, warehouses, and marts.
• Deep understanding of data modeling (dimensional, normalized, and denormalized), data governance, and data quality frameworks.

It would be a plus if you also have previous experience in:

• Strong grasp of cloud platforms (AWS, Azure, or GCP) as they relate to data storage, processing, and security.
• Certification, or progress toward certification, in Matillion or Snowflake.
• Hands-on experience with data cataloging tools and metadata management frameworks.
• Exposure to machine learning workflows and MLOps in a data engineering context.

🏖️ Benefits

• Flexible working hours
• Professional onboarding and training options
• Powerful team looking forward to working with you
• Career coaching and development opportunities
• Health benefits
• 401(k)


Similar Jobs

August 9

Data Engineer at Versa Networks designs, builds, and maintains data pipelines; remote Canada role leveraging Airflow, Spark, Python, and cloud tech to enable AI/ML workflows.

Airflow, Apache, BigQuery, Cloud, Docker, Google Cloud Platform, Kubernetes, Python, Ray, Rust, Spark, Terraform, Go

July 5

Become a key contributor in data engineering at Unity, enhancing data pipelines for machine learning.

AWS, Cloud, Google Cloud Platform, Java, Python, PyTorch, Scala, Spark, Tensorflow, Unity

July 5

Join Unity to enhance data pipelines for Deep Learning models in AdTech.

AWS, Cloud, Google Cloud Platform, Java, Python, PyTorch, Scala, Spark, Tensorflow, Unity

May 23

Join Tiger Analytics as an AWS Data Engineer to build scalable data solutions for Fortune 500 clients.

Airflow, Amazon Redshift, Apache, AWS, Cloud, PySpark, Spark, SQL

April 22

Join Sunrise Robotics to design and implement data processes enhancing intelligent robotics in manufacturing.

Airflow, Apache, Assembly, Cassandra, Cloud, Grafana, Java, MongoDB, Python, Scala, Spark, SQLite, Terraform, Unity

Built by Lior Neu-ner. I'd love to hear your feedback: get in touch via DM or support@remoterocketship.com