Data Engineer

November 19


Thumbtack

B2C • Marketplace • Home Services

Thumbtack is a platform that connects people with local professionals offering services such as home improvement, repair, and cleaning. Users can search for specific services like house cleaning, handyman work, and electrical and wiring repair, with access to trusted professionals, reviews, and cost estimates. The platform is designed to make finding, hiring, and working with professionals simple and efficient, with the convenience of managing everything from one app. It serves millions of people across the U.S., providing a wide range of services including landscaping, event planning, and more.

1001 - 5000 employees


📋 Description

• Collaboratively refine and evangelize a comprehensive framework for integrating data-thinking into the software development lifecycle for product teams

• Design, architect, and maintain core marketing datasets, data marts, and feature stores that support a blend of mature products and features alongside a rapidly evolving product line, in partnership with analytics, data science, and machine learning

• Work with teams of product engineers, analysts, data scientists, and machine learning engineers throughout Thumbtack to understand their data needs, and help design datasets with the same engineering rigor as any other software we build

• Drive data quality and best practices across different business areas

• Help build the next generation of data products at Thumbtack, including real-time data products built on top of Apache Kafka

🎯 Requirements

• 2 or more years of experience designing and building datasets and warehouses

• Excellent ability to understand the needs of, and collaborate with, stakeholders in other functions, especially Analytics, and to identify opportunities for process improvements across teams

• Experience with SQL for analytics, reporting, and business intelligence, as well as for building SQL- and Python-based transforms inside an ETL pipeline, or similar

• Experience designing, architecting, and maintaining datasets that integrate data from multiple sources, including production databases, clickstream data, and external APIs

• Familiarity building the above on a modern, cloud-native data warehouse stack; we use BigQuery, dbt, and Apache Airflow, but experience with a similar stack is fine

• Strong sense of ownership and pride in your work, from ideation and requirements gathering through project completion and maintenance
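As a rough illustration of the kind of Python-based transform the requirements mention (a hypothetical sketch, not Thumbtack's actual pipeline — the function name, fields, and logic are all assumptions): a small step that deduplicates raw clickstream events and normalizes their timestamps to UTC, the sort of unit that might run inside an Airflow task before loading into a warehouse.

```python
from datetime import datetime, timezone

def transform_clickstream(events):
    """Deduplicate raw clickstream events by event_id and normalize
    epoch timestamps to UTC ISO-8601 strings (hypothetical example)."""
    seen = set()
    out = []
    for e in events:
        if e["event_id"] in seen:
            continue  # drop duplicate deliveries (e.g. at-least-once ingestion)
        seen.add(e["event_id"])
        out.append({
            "event_id": e["event_id"],
            "user_id": e["user_id"],
            "ts": datetime.fromtimestamp(e["ts_epoch"], tz=timezone.utc).isoformat(),
        })
    return out
```

In a real pipeline a transform like this would typically be expressed as a dbt model or an Airflow task operating on warehouse tables rather than in-memory lists; the sketch only shows the shape of the logic.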


