Forward Deployed Software Engineer, Data


2 days ago


Spiritual Data

Artificial Intelligence • SaaS • Productivity

Spiritual Data is an AI-powered analytics platform that helps data teams work more productively and efficiently. Its tools, such as DataPilot, automate and optimize data management tasks, changing how data engineering teams operate. By providing AI-driven insights and recommendations, Spiritual Data helps teams improve data documentation, quality testing, and query handling, delivering significant cost savings and operational improvements. The company serves analytics, platform, and data engineers, integrating its solutions into daily workflows and popular tools such as VSCode, Python, and Slack, with features that auto-generate models, run tests, and produce documentation. Spiritual Data is trusted by leading enterprises and offers robust security features without compromising AI functionality.

📋 Description

• Lead the technical implementation, integration, and optimization of Altimate AI's platform within complex enterprise data environments.
• Serve as the primary technical point of contact for key accounts, translating customer needs into technical requirements for the product and engineering teams.
• Develop custom scripts, integrations, and tools using Python to extend the capabilities of our platform and ensure seamless operation within customer ecosystems.
• Troubleshoot, diagnose, and resolve complex technical issues related to data connectivity, performance, and workflow integration.
• Contribute to the core product by identifying common integration patterns and developing reusable components or product features.
• Conduct technical workshops and provide expert guidance on data best practices, leveraging tools like Snowflake, Databricks, and dbt.

🎯 Requirements

• 4+ years of professional software engineering or data engineering experience
• Bachelor's or Master's degree in Computer Science or a related technical field
• Expert proficiency in Python
• Deep, hands-on experience with the modern data stack, including:
  • Cloud platforms (AWS, GCP, or Azure)
  • Data Warehouses/Lakes (Snowflake, Databricks, or similar)
  • Data Transformation/Modeling (dbt)
  • Workflow Orchestration (Airflow or similar)
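
For orientation only (not part of the posting), here is a minimal sketch of how the stack listed above commonly fits together: an Airflow DAG that builds and then tests a dbt project against a warehouse such as Snowflake. The DAG id, schedule, and project path are illustrative assumptions, not details from this role.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_run",           # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build dbt models; the project directory is an assumed path.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    # Run dbt tests once the models have built.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test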

🏖️ Benefits

• Competitive salary
• Flexible working hours
• Professional development opportunities

