Senior Data Engineer

Job not on LinkedIn

September 10

Apply Now

Spiritual Data

Artificial Intelligence • SaaS • Productivity

Spiritual Data is an advanced analytics platform that uses artificial intelligence to make data teams more productive and efficient. Its tools, such as DataPilot, automate and optimize data management tasks, changing how data engineering teams operate. By providing AI-driven insights and recommendations, Spiritual Data helps teams improve their data documentation, quality testing, and query handling, delivering significant cost savings and operational gains. The company serves analytics, platform, and data engineers by integrating its solutions directly into daily workflows and popular tools like VSCode, Python, and Slack, with features that auto-generate models, run tests, and produce documentation. Spiritual Data is trusted by leading enterprises and offers robust security features without compromising AI functionality.

📋 Description

• This remote position requires overlap with the US Pacific time zone
• Build data pipelines that scale to 100K+ jobs and petabytes of data per day
• Design and implement cloud-native data infrastructure on AWS using Kubernetes and Airflow
• Design and develop SQL intelligence systems for query optimization, dynamic pipeline generation, and data lineage tracking
• Contribute to open-source initiatives
• Work with a team of engineers to integrate AI into data operations

🎯 Requirements

• 8+ years of experience in data engineering, with a focus on building scalable data pipelines and systems
• Strong proficiency in Python and SQL
• Extensive experience with SQL query profiling, optimization, and performance tuning, preferably with Snowflake
• Deep understanding of the SQL Abstract Syntax Tree (AST) and experience working with SQL parsers (e.g., sqlglot) for generating column-level lineage and dynamic ETLs
• Experience building data pipelines using Airflow or dbt
• [Optional] Solid understanding of cloud platforms, particularly AWS
• [Optional] Familiarity with Kubernetes (K8s) for containerized deployments

🏖️ Benefits

• Competitive salary and equity
• Extensive professional development opportunities and resources for continuous learning
• Dynamic and intellectually stimulating work environment with a team of talented engineers
• Opportunities to shape the direction of the company and leave a lasting impact


Similar Jobs

September 9

Lead Data Engineer building and scaling Hopscotch Primary Care's data platform, pipelines, and governance to support care delivery and analytics.

AWS, Azure, Cloud, Google Cloud Platform, PySpark, Python

September 6

Data Architect designing enterprise Databricks Lakehouse for Live Nation Entertainment. Lead architecture, modeling, governance, and mentorship across data platform.

AWS, Azure, Cloud, ETL, Hadoop, Kafka, NoSQL, Spark

September 5

Data Engineer building scalable data pipelines and platforms for ForceMetrics, a public-safety data analytics startup. Empowering internal and external stakeholders with reliable data and analytics.

Apache, BigQuery, Cyber Security, Docker, ElasticSearch, Kubernetes, MySQL, Postgres, Python, SQL, Tableau

September 5

Senior Data Engineer building and optimizing AWS data pipelines for a government-focused digital services company. Develop ETL, data architecture, and data quality solutions for federal clients.

Airflow, Amazon Redshift, Apache, AWS, Cloud, EC2, ETL, Hadoop, Java, JavaScript, MySQL, Postgres, Python, Scala, Spark, SQL

September 4

Design and operate low-latency, scalable ETL and storage solutions for Oscilar's AI-driven fraud, credit, and compliance decisioning platform; mentor engineers.

Airflow, Cloud, DynamoDB, ETL, Grafana, Java, Kafka, Postgres, Prometheus, Python, Redis, SQL, Terraform

Built by Lior Neu-ner. I'd love to hear your feedback — Get in touch via DM or support@remoterocketship.com