
B2B • Energy • Science
Veralto is a global enterprise comprising 13 operating companies and over 300 locations worldwide. With a workforce of 16,000 associates, Veralto focuses on impactful work in areas crucial to everyday life, such as water, food, and medicine. The company's Water Quality division manages, treats, purifies, and protects water on a global scale, while the Product Quality & Innovation division ensures the safety and authenticity of essential goods in the global supply chain. Committed to fostering a diverse and inclusive workplace, Veralto invests in its employees' growth through hands-on learning and career development opportunities, supported by a global network and the resources of an S&P 500 company.
August 16

• Design, develop, and maintain scalable ELT/ETL pipelines using Matillion and Snowflake to support diverse data integration and transformation needs across the company.
• Architect end-to-end data workflows that ensure high performance, reliability, and data integrity for both batch and near-real-time use cases.
• Collaborate with cross-functional teams, including Data Analysts, DevOps, and business stakeholders, to gather requirements and deliver data solutions that drive value.
• Define and implement best practices for data modeling, metadata management, data lineage, and governance, using the features of Snowflake and Matillion.
• Optimize data storage, retrieval, and computation to ensure efficient processing and cost control within our cloud infrastructure.
• Monitor, troubleshoot, and resolve issues related to data pipelines, performance bottlenecks, and data quality.
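The first responsibility centers on ELT/ETL transformation steps. As a purely illustrative sketch, not tied to Matillion or Snowflake (the record fields and function names here are hypothetical), such a step typically cleans and validates extracted rows before loading them into a warehouse table:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """Hypothetical normalized record destined for a warehouse table."""
    site: str
    value_ppm: float

def transform(raw_rows):
    """Clean and normalize raw extracted rows (illustrative only).

    Rows failing basic data-quality checks (missing or non-numeric
    measurements) are dropped, mirroring the kind of validation an
    ELT pipeline applies before loading.
    """
    out = []
    for row in raw_rows:
        site = row.get("site", "").strip().upper()
        try:
            value = float(row["value_ppm"])
        except (KeyError, ValueError):
            continue  # reject rows that fail validation
        out.append(Reading(site=site, value_ppm=value))
    return out

# Second row lacks a measurement, so only the first survives.
rows = [{"site": " ams-1 ", "value_ppm": "3.2"}, {"site": "bcn-2"}]
print(transform(rows))  # → [Reading(site='AMS-1', value_ppm=3.2)]
```

In a real Matillion/Snowflake deployment this logic would usually live in transformation components or SQL rather than application code; the sketch only shows the shape of the work.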
• 3+ years of continuous experience with Matillion ELT/ETL for cloud data warehouses, including designing complex orchestration jobs, transformation components, and API integrations.
• Advanced knowledge of Snowflake, including schema design, security, performance tuning, streams and tasks, and cost-optimization strategies.
• Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related field, or equivalent work experience.
• 5+ years of professional experience in data engineering, with at least 2 years in a senior role.
• Proven experience architecting large-scale, distributed data systems and implementing data lakes, warehouses, and marts.
• Deep understanding of data modeling (dimensional, normalized, and denormalized), data governance, and data quality frameworks.
It would be a plus if you also have experience in:
• Cloud platforms (AWS, Azure, or GCP) as they relate to data storage, processing, and security.
• Certification, or progress toward certification, in Matillion or Snowflake.
• Data cataloging tools and metadata management frameworks.
• Machine learning workflows and MLOps in a data engineering context.
• Flexible working hours
• Professional onboarding and training options
• A strong team looking forward to working with you
• Career coaching and development opportunities
• Health benefits
• 401(k)
August 9
Data Engineer at Versa Networks designs, builds, and maintains data pipelines; remote Canada role leveraging Airflow, Spark, Python, and cloud tech to enable AI/ML workflows.
Airflow
Apache
BigQuery
Cloud
Docker
Google Cloud Platform
Kubernetes
Python
Ray
Rust
Spark
Terraform
Go
July 5
Become a key contributor in data engineering at Unity, enhancing data pipelines for machine learning.
AWS
Cloud
Google Cloud Platform
Java
Python
PyTorch
Scala
Spark
Tensorflow
Unity
July 5
Join Unity to enhance data pipelines for Deep Learning models in AdTech.
AWS
Cloud
Google Cloud Platform
Java
Python
PyTorch
Scala
Spark
Tensorflow
Unity
May 23
Join Tiger Analytics as an AWS Data Engineer to build scalable data solutions for Fortune 500 clients.
Airflow
Amazon Redshift
Apache
AWS
Cloud
PySpark
Spark
SQL
April 22
Join Sunrise Robotics to design and implement data processes enhancing intelligent robotics in manufacturing.
Airflow
Apache
Assembly
Cassandra
Cloud
Grafana
Java
MongoDB
Python
Scala
Spark
SQLite
Terraform
Unity