Head of Data Engineering

Yesterday


Block Labs

Crypto • Web 3 • B2B

Block Labs is a Web3-focused builder, investor, and marketing partner for blockchain projects. The company provides end-to-end blockchain development (smart contracts, decentralized wallets, NFT exchanges, payment gateways), web3-native marketing (social management, influencer/KOL campaigns, paid media, PR, partnerships), and investment support from seed to Series A. Block Labs works with founders to accelerate growth through technical development, go-to-market strategies, and strategic capital, operating from Sofia, Bulgaria.

51 - 200 employees

Founded 2022


📋 Description

• Set the direction for our data platform, covering ingestion, transformation, warehousing and governance.
• Build and lead a strong data engineering team capable of supporting both real-time and batch workloads.
• Design data architectures that serve multiple products, including AI models, real-time scoring systems and analytical tools.
• Establish clear data standards, schemas and contracts so teams can integrate with confidence.
• Put in place robust data quality checks, validation steps and reconciliation processes to ensure information is accurate and consistent.
• Work closely with AI teams to provide reliable feature pipelines and training data, and to support inference workflows.
• Collaborate with backend and platform teams to integrate event streams, APIs and operational data into the platform.
• Implement proper observability across the data stack, including lineage, freshness, completeness and performance.
• Make informed decisions on tooling across storage, orchestration, streaming and transformation.
• Oversee documentation and architectural guidelines to keep the platform coherent as it grows.
• Support product and partner teams when data from casinos or external systems needs to be integrated.
• Maintain awareness of infrastructure costs and design choices that keep the platform sustainable in the long run.

🎯 Requirements

• Strong experience leading data engineering teams and delivering production data platforms that support real products, not just dashboards.
• Comfortable designing and operating data pipelines and warehouses that serve both operational systems and analytical or modelling needs.
• Good understanding of streaming technologies, batch processing and cloud-native data tooling.
• Practical experience working with machine learning teams or directly with ML pipelines, including feature generation, training-data preparation and model serving.
• Familiar with ML operations in a real setting, such as monitoring model inputs and outputs, handling data drift, and keeping training and production environments in sync.
• Hands-on experience with workflow orchestrators such as Temporal, Airflow or Dagster, ideally across both data and ML workflows.
• Deep knowledge of SQL and at least one modern data processing engine or framework.
• Comfortable with tools that support data quality, lineage, metadata and observability across both data and model pipelines.
• Able to work closely with AI, backend, product and analytics teams and explain technical decisions in clear language.
• Confident in architectural decision-making, setting standards, and mentoring engineers at different levels.
• Someone who cares about clarity, reliability and thoughtful engineering, and who understands the trade-offs between speed, cost and long-term maintainability.

🏖️ Benefits

• Health insurance
• Retirement plans
• Paid time off
• Flexible work arrangements
• Professional development

