
NerdWallet is on a mission to help provide clarity for consumers and SMBs so they can make financial decisions with confidence. We work hard to cultivate an award-winning culture in which our Nerds can realize this mission, and we pride ourselves on the programs we’ve created to positively impact the lives of our consumers, our Nerds, and our wider world.
501 - 1000 employees
Founded 2009
💰 Secondary Market on 2021-05
October 16

• Lead the design, development, and maintenance of business-critical data assets, ensuring they are accurate, reliable, and aligned with evolving business priorities
• Drive technical innovation and process excellence, evaluating emerging technologies and implementing scalable, efficient solutions that improve data pipeline performance and reliability
• Tackle complex technical challenges - balancing scalability, security, and performance - while providing clear rationale for architectural decisions and aligning outcomes across teams
• Ensure data pipeline reliability and observability, proactively identifying and resolving issues, investigating anomalies, and improving monitoring to safeguard data integrity
• Build trust and alignment across cross-functional teams through transparent communication, collaborative problem-solving, and a deep understanding of partner needs
• Bring clarity and direction to ambiguity, taking ownership of initiatives that span multiple domains or teams, and providing technical leadership to ensure successful delivery
• Prioritize work strategically, balancing business impact, risk, and execution to drive measurable outcomes that support organizational goals
• Act as a trusted technical advisor and thought leader, shaping the team’s long-term architecture and influencing best practices
• Foster a culture of technical excellence and continuous learning, mentoring engineers and championing modern data engineering practices, including AI and automation-enabled solutions
• 7+ years of relevant professional experience in data engineering
• 5+ years of experience with AWS, Snowflake, DBT, Airflow
• Advanced level of proficiency in Python and SQL
• Working knowledge of relational databases and query performance tuning (SQL)
• Working knowledge of streaming technologies such as Storm, Kafka, Kinesis, and Flume
• Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field (or equivalent professional experience)
• Advanced level of proficiency applying principles of logical thinking to define problems, collect data, establish facts, and draw valid conclusions
• Experience designing, building, and operating robust data systems with reliable monitoring and logging practices
• Strong communication skills, both written and verbal, with the ability to articulate information to team members of all levels and varying degrees of applicable knowledge throughout the organization
• Monthly Healthcare Stipend
• Rejuvenation Policy – Vacation Time Off, plus the official public holidays in your province and 4 Mental Health Days Off
• Paid sabbatical for Nerds to recharge, gain knowledge, and pursue their interests
• Monthly Wellness Stipend, Wifi Stipend, and Cell Phone Stipend
• Work from home equipment stipend
September 24
51 - 200
Lead data platform architecture and build petabyte-scale pipelines with Snowflake, DBT, and Airflow to enable analytics at Docker.
🇨🇦 Canada – Remote
💵 $200.4k - $275.6k / year
💰 $105M Series C on 2022-03
⏰ Full Time
🔴 Lead
🚰 Data Engineer
Airflow
Amazon Redshift
Apache
AWS
Azure
Cloud
Docker
Google Cloud Platform
Kubernetes
Python
SQL
September 13
51 - 200
Principal Data Engineer leading Docker's data platform architecture. Building scalable data infrastructure and governance to support millions of developers.
Airflow
Apache
AWS
Azure
BigQuery
Cloud
Distributed Systems
Docker
ERP
ETL
Google Cloud Platform
Java
Kafka
Python
Rust
Scala
SQL
Terraform
Go
June 25
As a Pre-Training Data Engineer, you'll manage training data for AI models at Cohere.
Apache
Pandas
Python
Spark
May 23
Join Tiger Analytics as an AWS Data Engineer to build scalable data solutions for Fortune 500 clients.
Airflow
Amazon Redshift
Apache
AWS
Cloud
PySpark
Spark
SQL
February 8
Join Tiger Analytics as a Data Engineer, leveraging your skills in data science and machine learning.
AWS
Cloud
EC2
Java
MySQL
NoSQL
Postgres
Python
Scala
SQL