Senior Data Engineer

Ethena Labs

Finance • Fintech • Web3

Ethena Labs builds synthetic financial products, such as a Synthetic Dollar with Internet Native Yield (sUSDe). The company aims to provide innovative, transparent financial solutions built on blockchain technology, and is backed by a robust community within a decentralized financial ecosystem.

📋 Description

• Rapidly spin up the cloud environment and deliver working historical backfill pipelines from Tardis.dev into a queryable database.
• Deliver a real-time Tardis WebSocket pipeline by Day 60, ensuring data is normalized, cached for live consumption, accurate, replayable, and queryable.
• Ensure all pipelines are idempotent, retryable, and use exactly-once semantics; implement full CI/CD, Terraform, automated testing, and secrets management.
• Implement proper observability (structured logs, metrics, dashboards, alerting) from day one.
• Provide immediate self-service access to the MVP database for Trading and BI teams via tools such as Tableau/Metabase and through simple internal REST APIs.
• Develop specialized time-series datasets, including a USDe backing-asset timeseries and a full opportunity-surface timeseries covering delta-neutral, lending, and borrow opportunities.
• Ingest data from additional sources (Kaiko, CoinAPI, on-chain via TheGraph/Dune) and plan for 10x+ data growth via schema evolution, partitioning, and performance tuning.
• Establish enterprise-grade governance, including a data quality framework, RBAC, audit logs, and a semantic layer.
• Create full architecture documentation, runbooks, and a data dictionary; onboard and mentor future junior staff.
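The idempotent, replay-safe ingestion described above is commonly approximated with a deterministic primary key plus an upsert, so re-running the same batch is a no-op. Below is a minimal sketch of that pattern; all table and field names are hypothetical, and SQLite stands in for a time-series store such as TimescaleDB or ClickHouse.

```python
import sqlite3

# Idempotent ingestion sketch (hypothetical schema): the composite primary
# key (exchange, symbol, trade_id) makes duplicate inserts a no-op, which
# approximates exactly-once semantics at the storage layer even when an
# upstream retry or backfill replays the same data.

DDL = """
CREATE TABLE IF NOT EXISTS trades (
    exchange TEXT NOT NULL,
    symbol   TEXT NOT NULL,
    trade_id TEXT NOT NULL,
    ts_us    INTEGER NOT NULL,  -- event time in microseconds
    price    REAL NOT NULL,
    size     REAL NOT NULL,
    PRIMARY KEY (exchange, symbol, trade_id)
)
"""

BATCH = [
    ("binance", "BTCUSDT", "t-1", 1_700_000_000_000_000, 60000.0, 0.50),
    ("binance", "BTCUSDT", "t-2", 1_700_000_000_100_000, 60001.5, 0.25),
]

def ingest(conn: sqlite3.Connection, batch) -> None:
    """Insert a batch of normalized trades; rows with duplicate keys are
    skipped, so retries and replays cannot double-count."""
    conn.executemany(
        "INSERT OR IGNORE INTO trades VALUES (?, ?, ?, ?, ?, ?)", batch
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute(DDL)
ingest(conn, BATCH)
ingest(conn, BATCH)  # replay the same batch: must not create duplicates
row_count = conn.execute("SELECT COUNT(*) FROM trades").fetchone()[0]
print(row_count)  # 2
```

The same idea carries over to TimescaleDB/ClickHouse (`ON CONFLICT DO NOTHING`, `ReplacingMergeTree`), where the key choice also doubles as the partitioning dimension.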

🎯 Requirements

• Proven track record of delivering working production data pipelines in weeks, not months, with the ability to ruthlessly cut scope to hit a 60-day MVP while managing technical debt.
• Experience building Tardis historical and real-time pipelines (or equivalent high-quality crypto market data feeds), including their specific quirks, rate limits, and WebSocket structures.
• Expert in large-scale, reliable ETL/ELT for financial or market data.
• Fluent in provisioning full environments with Terraform in days, and expert in AWS/GCP serverless technologies.
• Expert Python and SQL skills, plus proficiency with time-series databases such as TimescaleDB or ClickHouse, ensuring fast queries from day one.
• Advanced knowledge of WebSocket clients, message queues, low-latency streaming, GitOps, automated testing/deployment, and observability practices.
• Significant understanding of stablecoins, lending protocols, and opportunity-surface concepts, or a proven ability to ramp up extremely quickly.
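The WebSocket and normalization skills above usually come down to one step: mapping each venue's raw payload onto a single common record shape before caching and storage. A hedged sketch of that adapter pattern follows; the field names are illustrative and are not Tardis.dev's or any exchange's actual schema.

```python
from datetime import datetime, timezone

# Normalization sketch for a real-time market-data pipeline: raw
# per-exchange WebSocket payloads are mapped onto one common record shape.
# The "binance"-style payload keys below are illustrative assumptions.

def normalize_trade(exchange: str, raw: dict) -> dict:
    """Map a raw trade payload into a common schema; each venue gets its
    own small adapter, and unknown venues fail loudly instead of passing
    malformed rows downstream."""
    if exchange == "binance":  # hypothetical Binance-style payload
        return {
            "exchange": exchange,
            "symbol": raw["s"],
            "side": "sell" if raw["m"] else "buy",  # m: buyer is maker
            "price": float(raw["p"]),
            "size": float(raw["q"]),
            "ts": datetime.fromtimestamp(raw["T"] / 1000, tz=timezone.utc),
        }
    raise ValueError(f"no adapter registered for exchange: {exchange}")

raw_msg = {"s": "ETHUSDT", "p": "3000.10", "q": "1.5",
           "T": 1_700_000_000_000, "m": False}
trade = normalize_trade("binance", raw_msg)
print(trade["side"], trade["price"])  # buy 3000.1
```

Keeping adapters tiny and failing loudly on unknown venues is what makes the downstream store queryable on day one: every row already has the same columns, types, and timezone-aware timestamps.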

🏖️ Benefits

• Flexible, remote-friendly work environment
• Established opportunities for personal growth and learning
