Senior Data Engineer

November 25


Button

SaaS • eCommerce • B2B

Button is a B2B SaaS platform that optimizes mobile commerce by routing users to the highest-converting destination using AI-driven smart routing and deep-linking. It helps retailers, publishers, advertisers and creators stack affiliate, retail media, and seller budgets on a single platform, drive in-app conversions, and provide privacy-safe, deterministic attribution and performance analytics. Button integrates with mobile measurement partners (MMPs) and offers plug-and-play solutions to increase revenue per tap, app engagement, and publisher monetization.

51 - 200 employees

Founded 2014

☁️ SaaS

🛍️ eCommerce

🤝 B2B

📋 Description

• Collaborate closely with data scientists, analysts, product managers, and ML and infrastructure engineers to design, build, and deploy reusable components for data ingestion, processing, modeling, and machine learning.

• Partner with Engineering and Data leadership to shape and execute Button’s data strategy and roadmap.

• Build and operate robust, scalable ELT pipelines processing billions of events using BigQuery, dbt, Airflow, and/or AWS, including custom integrations with external APIs as needed (a minimal orchestration sketch follows this list).

• Implement comprehensive observability for critical business processes through monitoring, alerting, logging, and tracing, and by defining clear objectives for freshness, latency, and reliability.

• Deliver accurate, well-modeled datasets that satisfy both functional and non-functional business requirements, including standards for privacy, security, scalability, and cost.

• Troubleshoot and resolve data issues end-to-end, driving root-cause analysis and preventative solutions.

• Own and evolve Button’s data infrastructure lifecycle, including architecture design, data quality and governance, storage optimization, orchestration, access controls, and cost management.

• Document best practices and actively mentor teammates through code reviews, documentation, and design discussions.
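As a rough illustration of the ELT work described above (not Button’s actual pipeline), the sketch below shows an hourly Airflow DAG that lands raw events via a custom API integration, runs dbt models against the warehouse, and checks source freshness. The DAG name, task names, and dbt selector are hypothetical placeholders.

# A minimal sketch, assuming Airflow 2.x and a dbt project on the worker;
# all names here ("events_elt", "ingest_raw_events", "--select events") are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def ingest_events(**context):
    # Placeholder for a custom integration with an external API:
    # pull the last interval of events and land them as raw files/tables.
    print(f"Ingesting events for interval ending {context['ts']}")


with DAG(
    dag_id="events_elt",                      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    ingest = PythonOperator(task_id="ingest_raw_events", python_callable=ingest_events)

    # Transform raw events into well-modeled datasets with dbt (e.g. against BigQuery).
    transform = BashOperator(task_id="dbt_run", bash_command="dbt run --select events")

    # Basic observability hook: fail loudly if source freshness objectives are violated.
    freshness = BashOperator(task_id="dbt_freshness", bash_command="dbt source freshness")

    ingest >> transform >> freshness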

🎯 Requirements

• 7+ years of SQL experience

• 7+ years of experience building, maintaining, and supporting high-volume data systems and infrastructure

• Experience with custom API integrations (see the sketch after this list)

• Experience with cloud data warehouses such as BigQuery and Snowflake

• Experience with Python, Apache Airflow, and dbt

• Experience with AWS services such as RDS, Aurora, and S3

• Experience with relational and non-relational databases (PostgreSQL, MySQL, DynamoDB, Redis)

• Experience with infrastructure as code (Terraform is a plus)

• Experience with other AWS and GCP services not listed above is a plus
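As a hedged illustration of the custom API integration work listed above, the sketch below pages through a hypothetical JSON API and lands the raw payloads in an S3 bucket. The endpoint, bucket name, and response fields ("results", "next_cursor") are assumptions for the example, not a specific vendor's API.

# A minimal extract-and-land sketch, assuming a cursor-paginated JSON API and an S3 landing bucket.
import json
from datetime import datetime, timezone

import boto3
import requests

API_URL = "https://api.example.com/v1/events"   # hypothetical endpoint
BUCKET = "raw-events-landing"                   # hypothetical bucket


def extract_to_s3() -> None:
    s3 = boto3.client("s3")
    page, cursor = 0, None
    while True:
        params = {"cursor": cursor} if cursor else {}
        resp = requests.get(API_URL, params=params, timeout=30)
        resp.raise_for_status()
        payload = resp.json()

        # Land each page as raw JSON, partitioned by load date.
        key = f"events/{datetime.now(timezone.utc):%Y/%m/%d}/page-{page:05d}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(payload["results"]))

        cursor = payload.get("next_cursor")
        page += 1
        if not cursor:
            break


if __name__ == "__main__":
    extract_to_s3()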

🏖️ Benefits

• 401(k) plan with 3% annual company contribution

• Unlimited time off, including birthdays off

• Periodic Mental Health Weeks

• Employee assistance program

• 100% insurance premium coverage for employees and 75% for dependents

• Complimentary One Medical memberships for employees and dependents

• Monthly stipend for mobile phone/internet

• Annual lifestyle stipend

• All Access memberships to WeWork in select markets

• Regular coworking days and social events


Similar Jobs

November 25

Senior Data Engineer implementing and monitoring data pipelines. The company offers disruptive technology solutions focused on Robotization, Artificial Intelligence, and Analytics.

🗣️🇧🇷🇵🇹 Portuguese Required

Airflow

Cloud

ETL

Google Cloud Platform

NoSQL

Python

Spark

SQL

November 25

Senior Data Engineer improving real-world entity identification datasets for Demandbase's account-based GTM strategies. Leading initiatives and collaborating on machine learning model development within a dynamic engineering team.

Airflow

Apache

AWS

EC2

Java

Scala

SDLC

Spark

SQL

November 25

Software Engineer building scalable data products and APIs for Demandbase’s B2B data platform. Collaborating with teams to enhance data-driven products.

AWS

BigQuery

Cloud

ETL

Google Cloud Platform

Java

Kafka

Postgres

Python

React

Scala

TypeScript

Go

November 25

Senior Data Engineer at People Data Labs building scalable data solutions and infrastructure using modern data tools and technologies.

Airflow

Amazon Redshift

Apache

AWS

Azure

BigQuery

Cloud

Google Cloud Platform

Java

Python

Scala

Spark

SQL

November 25

Data Engineer responsible for building data infrastructures at People Data Labs. Working with Spark, SQL, and AWS for diverse data transformations in a dynamic environment.

Airflow

Amazon Redshift

Apache

AWS

Azure

BigQuery

Cloud

Google Cloud Platform

Java

Python

Scala

Spark

SQL
