Senior Data Engineer

Job not on LinkedIn

September 20

Apply Now

Vantage Data Centers


Vantage Data Centers designs, develops, and operates highly flexible, scalable data centers for hyperscalers, cloud providers, and large enterprises, with state-of-the-art facilities across five continents and 21 markets. Its large-scale campuses let customers grow steadily, with over 23 million square feet of data center space and 2.6GW+ of power capacity. Committed to sustainability and efficiency, Vantage delivers 100% uptime through standardized designs that offer predictability and performance. The company is backed by prominent investors such as DigitalBridge Group and Silver Lake, and maintains a strong financial standing and excellent customer satisfaction rates.

1001 - 5000 employees

Founded 2010

🏢 Enterprise

📋 Description

• Design data infrastructure to collect, process, and analyze massive retail media datasets
• Use real-time and batch processing technologies to push the boundaries of data processing at scale
• Lead architectural conversations and guide development of data infrastructure to ensure availability of high-quality, trustworthy data
• Oversee processing and transformation of data and ensure proper data hygiene for downstream applications
• Write maintainable Python code to deliver high-quality, scalable data management solutions
• Document new and existing features clearly to support team alignment and future development
• Coach and develop the Measurement team to raise collective capability and drive long-term growth
• Communicate with stakeholders across the organization to align expectations and ensure transparency regarding data infrastructure and management

🎯 Requirements

• 5+ years of experience working in data engineering, big data, and/or distributed systems for SaaS (Software as a Service) products
• Proficiency with Python, Django, and SQL
• Hands-on experience using Snowflake to build ETL/ELT pipelines, data warehousing, and analysis
• Hands-on experience using Great Expectations to test and validate data quality, or a similar data testing tool
• Hands-on experience with data modelling, data warehousing, and distributed systems
• Experience with privacy-compliant data processing (GDPR, CCPA) for advertising/retail media use cases
• Track record of successful collaboration with other engineering and product teams
• Familiarity with managing real-time streaming data using tools such as Kafka, Kinesis, and/or Pub/Sub
• Bonus: Hands-on experience with data lakes and ML (Machine Learning) pipelines
• Legally authorized to work in Canada (application asks for confirmation)
• Successful completion of a criminal background check required in final stages of hiring
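For candidates unfamiliar with the data-quality tooling named above: a tool like Great Expectations expresses checks declaratively against tables or DataFrames. The sketch below shows the same idea with only the Python standard library; the function names, fields, and sample batch are illustrative assumptions, not the Great Expectations API.

```python
# Hypothetical sketch of the kind of data-quality validation this role involves.
# All names here are illustrative; Great Expectations would express these as
# declarative "expectations" run against a Snowflake table or DataFrame.

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    failures = [i for i, row in enumerate(rows) if row.get(column) is None]
    return {"success": not failures, "failing_rows": failures}

def check_values_between(rows, column, low, high):
    """Return indices of rows where `column` falls outside [low, high]."""
    failures = [
        i for i, row in enumerate(rows)
        if row.get(column) is not None and not (low <= row[column] <= high)
    ]
    return {"success": not failures, "failing_rows": failures}

# Example: validate a tiny batch before loading it downstream.
batch = [
    {"order_id": 1, "spend": 12.50},
    {"order_id": 2, "spend": None},    # fails the null check
    {"order_id": 3, "spend": -4.00},   # fails the range check
]
null_check = check_not_null(batch, "spend")
range_check = check_values_between(batch, "spend", 0, 10_000)
```

In a production pipeline, a failing check would typically quarantine the batch rather than let bad rows reach downstream applications.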

🏖️ Benefits

• Remote-first work environment with a flexible work philosophy
• Home office support
• Annual company retreats
• Equal opportunity employer committed to diversity, equity, and inclusion
• Reasonable accommodations available to applicants on request

Apply Now

Similar Jobs

September 10

Senior Data Engineer building and maintaining high-quality data pipelines at Instacart. Supporting cross-functional teams and ensuring data accuracy for grocery commerce.

🇨🇦 Canada – Remote

💵 $161k - $179k / year

💰 $232M Venture Round in November 2021

⏰ Full Time

🟠 Senior

🚰 Data Engineer

Airflow

Cloud

ETL

Python

Spark

SQL

August 19

Airflow

Amazon Redshift

AWS

Azure

BigQuery

Cloud

ETL

Google Cloud Platform

IoT

Java

Kafka

Python

Scala

SQL

August 16

Senior Data Architect guiding cloud data platforms on AWS/Azure/GCP; mentors teams and enables AI/ML with Snowflake, Snowpark, Python.

AWS

Azure

Cloud

Google Cloud Platform

Node.js

Python

SQL

Tableau

August 16

Data Engineer at Trojan/Veralto builds scalable ELT/ETL pipelines with Matillion/Snowflake; enables analytics and data integrity.

AWS

Azure

Cloud

ETL

Google Cloud Platform

Matillion

August 9

Data Engineer at Versa Networks designs, builds, and maintains data pipelines; remote Canada role leveraging Airflow, Spark, Python, and cloud tech to enable AI/ML workflows.

Airflow

Apache

BigQuery

Cloud

Docker

Google Cloud Platform

Kubernetes

Python

Ray

Rust

Spark

Terraform

Go

Built by Lior Neu-ner. I'd love to hear your feedback — Get in touch via DM or support@remoterocketship.com