Data Engineer – Lambda

Job not on LinkedIn

October 10

Apply Now

P2P Labs & P2P Tech Services

Crypto • Finance • B2B

P2P Labs & P2P Tech Services is a leading provider of non-custodial staking infrastructure and technology for intermediaries such as Web3 wallets, exchanges, and custodians. Founded in 2018, the company lets users stake cryptocurrencies like Ethereum and Bitcoin without running their own nodes. P2P Labs delivers staking APIs, data analytics, and solutions that enhance the staking experience across multiple proof-of-stake networks.

11 - 50 employees

Founded 2018


📋 Description

• Design, maintain, and scale streaming ETL pipelines for blockchain data.
• Build and optimize ClickHouse data models and materialized views for high-performance analytics.
• Develop and maintain data exporters using orchestration tools.
• Implement data transformations and decoding logic.
• Establish and improve testing, monitoring, automation, and migration processes for pipelines.
• Ensure timely delivery of new data features in alignment with product goals.
• Write and maintain clear documentation for pipelines and workflows.
• Collaborate within the team to deliver accurate, reliable, and scalable data services that power the Lambda app.
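The "data transformations and decoding logic" responsibility can be illustrated with a minimal sketch: decoding an ERC-20 `Transfer(address,address,uint256)` log entry, where the sender and recipient are indexed topics and the amount is 32-byte big-endian data. All values below are hypothetical and the example uses only the standard library; real pipelines would typically use an ABI library.

```python
# Minimal sketch of ABI-style log decoding (hypothetical example values).
# In an ERC-20 Transfer event, topics[0] is the event signature hash,
# topics[1]/topics[2] are the 32-byte-padded from/to addresses, and
# the data field holds the amount as a 32-byte big-endian integer.

def decode_transfer(topics: list[str], data: str) -> dict:
    """Decode a Transfer(address,address,uint256) log entry."""
    from_addr = "0x" + topics[1][-40:]  # last 20 bytes of the padded topic
    to_addr = "0x" + topics[2][-40:]
    amount = int(data, 16)              # 32-byte big-endian uint256
    return {"from": from_addr, "to": to_addr, "value": amount}

log = {
    "topics": [
        "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
        "0x000000000000000000000000" + "ab" * 20,
        "0x000000000000000000000000" + "cd" * 20,
    ],
    "data": "0x" + "00" * 31 + "2a",  # uint256 value 42
}
decoded = decode_transfer(log["topics"], log["data"])
print(decoded["value"])  # → 42
```

In a streaming pipeline, a function like this would sit in the transform step between the raw-log source (e.g. Kafka) and the ClickHouse sink.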

🎯 Requirements

• 4+ years in Data Engineering (ETL/ELT, data pipelines, streaming systems).
• Strong SQL skills with columnar databases (ClickHouse, Druid, BigQuery, etc.).
• Hands-on experience with streaming frameworks (Flink, Kafka, or similar).
• Solid Python skills for data engineering and backend services.
• Proven track record of delivering pipelines and features to production on schedule.
• Strong focus on automation, reliability, maintainability, and documentation.
• Startup mindset: balance speed with quality.

Nice to Have:

• Experience operating ClickHouse at scale (performance tuning, partitioning, materialized views).
• Experience with CI/CD and automated testing for data pipelines (e.g., GitHub Actions, dbt).
• Knowledge of multi-chain ecosystems (EVM and non-EVM).
• Familiarity with blockchain/crypto data structures (transactions, logs, ABI decoding).
• Contributions to open-source or blockchain data infrastructure projects.

🏖️ Benefits

• Fully remote
• Full-time contractor (indefinite-term consultancy agreement)
• Competitive salary in USD (payment in crypto also available)
• Well-being program
• Mental health care program
• Compensation for education, including foreign-language and professional-growth courses
• Equipment and co-working reimbursement program
• Overseas conferences and community immersion
• Positive and friendly communication culture


