
B2C • Marketplace
TaskRabbit is a platform that connects customers with skilled 'Taskers' who provide various home-related services. These services include furniture assembly, home repairs, cleaning, moving, and yard work. TaskRabbit allows users to choose a Tasker based on price, skills, and reviews, and facilitates the entire process including scheduling, payment, and communication through its platform. With a wide range of services and a focus on customer satisfaction, TaskRabbit provides a convenient solution for tackling home projects big and small.
July 15
🏄 California – Remote
🗽 New York – Remote
💵 $136k - $180k / year
⏰ Full Time
🔴 Lead
🚰 Data Engineer
🦅 H1B Visa Sponsor

• We are seeking a Staff Data Engineer to lead the design, development, and optimization of our data infrastructure and analytics layers, enabling the creation of reliable, scalable, and high-quality data products across the company.
• This role will report to the Director of Data Engineering and Applications and work closely with both technical teams and non-technical stakeholders across the business.
• While this is an individual contributor role, it emphasizes mentorship and strategic guidance across both the data engineering and analytics engineering functions.
• The ideal candidate has deep experience building and maintaining modern data platforms with tools such as dbt, Airflow, and Snowflake (or equivalent), and brings strong expertise in data modeling, orchestration, and production-grade data pipelines.
• They excel at engaging with non-technical stakeholders to understand business needs and are skilled at translating those needs into well-defined metrics, semantic models, and self-serve analytical tools.
• They are comfortable shaping architectural direction and promoting best practices across the team, and they thrive in environments that require cross-functional collaboration, clear communication, and a strong sense of ownership.
• Expertise in building and maintaining ELT data pipelines using modern tools such as dbt, Airflow, and Fivetran
• Deep experience with cloud data warehouses such as Snowflake, BigQuery, or Redshift
• Strong data modeling skills (e.g., dimensional modeling, star/snowflake schemas) to support both operational and analytical workloads
• Proficiency in SQL and at least one general-purpose programming language (e.g., Python, Java, or Scala)
• Experience with streaming data platforms (e.g., Kafka, Kinesis, or equivalent) and real-time data processing patterns
• Familiarity with infrastructure-as-code tools like Terraform and DevOps practices for managing data platform components
• Hands-on experience with BI and semantic layer tools such as Looker, Mode, Tableau, or equivalent
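The stack named in the requirements above (dbt, Airflow, Fivetran, a cloud warehouse such as Snowflake) typically comes together as an orchestrated ELT flow: load raw data, transform it with dbt, then gate it with tests. Below is a minimal sketch of that pattern, assuming Airflow 2.x and an already configured dbt project; the DAG id, script path, dbt project directory, and target name are hypothetical placeholders, not details from this posting.

```python
# Minimal ELT orchestration sketch (assumes Airflow 2.x and an existing
# dbt project targeting a cloud warehouse). Paths and names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Extract/load step: stands in for a Fivetran sync or a custom loader
    # that lands raw data in the warehouse.
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command="python /opt/pipelines/load_raw.py",
    )

    # Transform step: build staging and mart models with dbt.
    run_dbt = BashOperator(
        task_id="run_dbt",
        bash_command="dbt run --project-dir /opt/dbt --target prod",
    )

    # Quality gate: run dbt tests before downstream dashboards consume the data.
    test_dbt = BashOperator(
        task_id="test_dbt",
        bash_command="dbt test --project-dir /opt/dbt --target prod",
    )

    load_raw >> run_dbt >> test_dbt
```

The load >> transform >> test ordering mirrors the ELT pattern the role calls for: raw data is landed first, modeled in the warehouse with dbt, and validated before it feeds BI or semantic-layer tools.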
• Taskrabbit is a Remote-First Company.
• The People. You will be surrounded by some of the most talented, supportive, smart, and kind leaders and teams -- people you can be proud to work with!
• The Diverse Culture. We believe that we make better decisions when our workforce reflects the diversity of the communities in which we operate. Women make up half of our leadership team, and our diversity representation is above the tech industry average.
• The Perks. Taskrabbit provides our US-based employees with employer-paid health insurance and a 401k match with immediate vesting. We offer all of our global employees generous and flexible time off with 2 company-wide closure weeks, Taskrabbit product stipends, wellness + productivity + education stipends, IKEA discounts, reproductive health support, and more. Benefits vary by country of employment.
July 9
Join TetraScience as a Scientific Data Architect, transforming complex scientific data into actionable outcomes.
AWS
Cloud
Python
July 8
Join NBCUniversal as a Staff Data Engineer to develop innovative data pipelines and solutions.
🗣️🇨🇳 Chinese Required
AWS
Azure
Cloud
ETL
Google Cloud Platform
Python
Spark
SQL
June 16
Design and maintain Azure cloud-native integration platforms and data solutions for state agencies.
Azure
Cloud
ERP
Flask
Java
JUnit
Maven
Pandas
PySpark
Python
Spring Boot
SQL
Terraform
Vault
May 31
Develop and deploy data solutions for improving efficiency in DoD and Navy IT projects.
AWS
Cloud
Cyber Security
ETL
PySpark
Python
ServiceNow
SQL
Terraform
May 9
Lead data engineering and operations to develop impactful data platforms at UNITE HERE HEALTH.
🇺🇸 United States – Remote
💵 $136k - $142k / year
⏰ Full Time
🟠 Senior
🔴 Lead
🚰 Data Engineer
🦅 H1B Visa Sponsor
Amazon Redshift
Apache
AWS
Azure
Cloud
ETL
Google Cloud Platform
Informatica
Kafka
Scala
Spark
SQL