Senior Data Engineer

October 22


Relay

Crypto • Web 3

Relay is a cryptocurrency trading interface that provides on-chain token swaps, buy/sell functionality, vaults, transaction history, and wallet connectivity (e.g., “Connect Wallet”). Its controls (Swap, Select Token, ETH/Base, Expand Chart) mark it as a decentralized exchange or swap dashboard focused on token trading and liquidity features across blockchain networks.

11-50 employees

₿ Crypto

🌐 Web 3

📋 Description

• Design and build reliable ETL/ELT pipelines to extract data from our TypeScript production systems into our data warehouse (a minimal sketch of such a pipeline follows this list)
• Instrument event tracking and data emission from our cross-chain relayer and application backend
• Process and transform blockchain transaction data, cross-chain events, and user activity across 85+ chains
• Build data models that support analytics on transaction volume, capital efficiency, bridge performance, and user behavior
• Ensure data quality, consistency, and reliability across all pipelines
• Establish data orchestration infrastructure using tools like Airflow, Dagster, or Prefect
• Implement monitoring, alerting, and observability for data pipelines
• Optimize query performance and data warehouse costs
• Build self-service data tools and documentation for the analytics team
• Partner with backend engineers to understand data schemas and implement proper data capture
• Work closely with our analytics lead to understand reporting needs and design appropriate data models
• Collaborate with product and business teams to define metrics and ensure data availability
• Document data flows, schemas, and best practices
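
To give a concrete flavor of the pipeline work described above, here is a minimal sketch of a daily Airflow DAG that extracts one day of swap events from a backend Postgres replica and appends them to a warehouse raw layer. This is an illustration only: it assumes Airflow 2.4+ with the Postgres provider installed, and every connection ID, table, and path (relay_backend_postgres, swap_events, raw_swap_events) is hypothetical rather than Relay's actual stack.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.decorators import task

# Hypothetical connection IDs and table names -- for illustration only.
SOURCE_CONN = "relay_backend_postgres"
WAREHOUSE_CONN = "analytics_warehouse"

with DAG(
    dag_id="swap_events_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    catchup=False,
) as dag:

    @task
    def extract(ds=None):
        """Dump one day of swap events from the production replica to a staging file."""
        from airflow.providers.postgres.hooks.postgres import PostgresHook

        hook = PostgresHook(postgres_conn_id=SOURCE_CONN)
        df = hook.get_pandas_df(
            "SELECT * FROM swap_events WHERE created_at::date = %(ds)s",
            parameters={"ds": ds},  # Airflow injects the logical date as `ds`
        )
        path = f"/tmp/swap_events_{ds}.parquet"
        df.to_parquet(path, index=False)
        return path

    @task
    def load(path):
        """Append the staged file to the warehouse raw layer."""
        import pandas as pd
        from airflow.providers.postgres.hooks.postgres import PostgresHook

        hook = PostgresHook(postgres_conn_id=WAREHOUSE_CONN)
        pd.read_parquet(path).to_sql(
            "raw_swap_events",
            hook.get_sqlalchemy_engine(),
            if_exists="append",
            index=False,
        )

    load(extract())

In practice the load step would target Snowflake, BigQuery, or Redshift rather than Postgres, and dbt would handle downstream modeling; the extract/load split shown here is the part this role would own.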

🎯 Requirements

• 3-5+ years of experience building data pipelines and working with data warehouses
• Strong proficiency in SQL and data modeling techniques
• Experience with Python for data processing and pipeline orchestration (a toy example follows this list)
• Familiarity with TypeScript/Node.js applications and how to extract data from them (or willingness to learn quickly)
• Experience with modern data stack tools (dbt, Airflow/Dagster/Prefect, Fivetran/Airbyte, etc.)
• Knowledge of data warehouse platforms (Snowflake, BigQuery, Redshift, or similar)
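
As a toy illustration of the Python and data-modeling skills listed above, the snippet below rolls raw cross-chain transfer events up into a daily per-chain volume table, running a simple data-quality gate first. All column names and values are invented for the example.

import pandas as pd

# Toy raw events -- in practice these would be read from the warehouse.
events = pd.DataFrame(
    {
        "tx_hash": ["0xa1", "0xb2", "0xc3", "0xd4"],
        "origin_chain": ["ethereum", "base", "base", "arbitrum"],
        "amount_usd": [1200.0, 50.0, 75.5, 300.0],
        "created_at": pd.to_datetime(
            [
                "2024-01-01 10:00",
                "2024-01-01 11:30",
                "2024-01-02 09:15",
                "2024-01-02 12:45",
            ]
        ),
    }
)

# Data-quality gate: fail loudly on missing amounts rather than
# silently undercounting volume downstream.
missing = int(events["amount_usd"].isna().sum())
assert missing == 0, f"{missing} events missing amount_usd"

# Daily per-chain volume model.
daily_volume = (
    events.assign(day=events["created_at"].dt.date)
    .groupby(["day", "origin_chain"], as_index=False)
    .agg(tx_count=("tx_hash", "count"), volume_usd=("amount_usd", "sum"))
)
print(daily_volume)

At warehouse scale the same shape would typically live in SQL/dbt rather than pandas, but the quality-gate-then-aggregate pattern carries over.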

🏖️ Benefits

• Competitive base salary ($200K+ depending on experience)
• Equity package
• Comprehensive health, dental, and vision insurance
• Annual company offsite
• Unlimited PTO policy with an encouraged minimum of 2 weeks annually
• Remote-first culture with emphasis on asynchronous communication and flexibility
• Opportunity to build foundational data infrastructure for the leading cross-chain payments platform


Similar Jobs

October 22
Data Engineer for Nest Veterinary designing and scaling data systems. Join a mission-driven team to advance technology in veterinary care with ETL pipelines and real-time data analytics.
BigQuery • Cloud • ETL • Google Cloud Platform • Java • Pandas • Python • SQL

October 22
Senior Data Engineer designing and maintaining data processing pipelines for financial insights and analytics. Collaborating across teams to develop scalable solutions for data analysis.
Airflow • Amazon Redshift • Apache • AWS • Azure • BigQuery • Cloud • ETL • Python • Spark • SQL

October 22
Data Engineer at Netflix collaborating with engineers to build scalable data pipelines. Transforming telemetry data into performance metrics for improved Quality of Experience.
Python • Spark • SQL

October 22
Data Engineer maintaining ETL projects at a leading cardiac data management company. Collaborating with data, engineering, and clinical teams to build scalable data workflows.
Airflow • BigQuery • Cloud • Docker • ETL • Python • SQL • Tableau

October 22
Senior Data Engineer developing large-scale data infrastructure and pipelines at Alludo. Supporting cross-functional teams and ensuring data accuracy and consistency.
Airflow • AWS • Cloud • ETL • Google Cloud Platform • Kafka • Kubernetes • Spark