Data Engineer – AI / ML

November 18

Apply Now

Satalia

Artificial Intelligence • B2B • Enterprise

Satalia builds AI and optimisation technology to improve business operations, delivering cloud-based, data-driven services through teams of cloud architects and data scientists. The company operates remotely, with hubs in Greece, London, Kaunas, and Vienna.

51 - 200 employees

Founded 2010

🤖 Artificial Intelligence

🤝 B2B

🏢 Enterprise

💰 Grant (February 2016)

📋 Description

• Collaborate closely with data scientists, architects, and other stakeholders to understand and break down business requirements
• Collaborate on schema design, data contracts, and architecture decisions, ensuring alignment with AI/ML needs
• Provide data engineering support for AI model development and deployment, ensuring data scientists have access to the data they need, in the format they need it
• Leverage cloud-native tools (GCP/AWS/Azure) for orchestrating data pipelines, AI inference workloads, and scalable data services
• Develop and maintain APIs for data services and serving model predictions
• Support the development of LLM-powered features, including prompt engineering, LLM calls, agentic frameworks, vector databases, and Retrieval-Augmented Generation (RAG) pipelines (see the sketch after this list)
• Implement and optimize data transformations and ETL/ELT processes, using appropriate data engineering tools
• Work with a variety of databases and data warehousing solutions to store and retrieve data efficiently
• Implement monitoring, troubleshooting, and maintenance procedures for data pipelines to ensure high data quality and optimize performance
• Participate in the creation and ongoing maintenance of documentation, including data flow diagrams, architecture diagrams, data dictionaries, data catalogues, and process documentation
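
To give a flavour of the RAG work mentioned above, here is a minimal, illustrative sketch only. It is not Satalia's stack: `embed` and `call_llm` are hypothetical placeholders for whatever embedding model and LLM endpoint a team might use, and the in-memory `store` list stands in for a real vector database.

```python
# Minimal, illustrative RAG sketch (not Satalia's implementation).
# `embed` and `call_llm` are hypothetical placeholders; the `store`
# list stands in for a vector database.
from math import sqrt
from typing import Callable

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query: str, store: list[tuple[str, list[float]]],
             embed: Callable[[str], list[float]], k: int = 3) -> list[str]:
    """Return the k documents whose embeddings are closest to the query."""
    q = embed(query)
    ranked = sorted(store, key=lambda doc: cosine(q, doc[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

def answer(query: str, store, embed, call_llm) -> str:
    """Compose a prompt from retrieved context and ask the LLM."""
    context = "\n".join(retrieve(query, store, embed))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return call_llm(prompt)
```

In practice this kind of retrieval step would typically sit behind the prediction-serving APIs the role describes, with the in-memory store replaced by a managed vector database.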

🎯 Requirements

• High proficiency in Python and SQL (see the pipeline sketch after this list)
• Strong knowledge of data structures, data modelling, and database operations
• Proven hands-on experience building and deploying data solutions on a major cloud platform (AWS, GCP, or Azure)
• Familiarity with containerization technologies such as Docker and Kubernetes
• Demonstrable experience building, implementing, and optimizing robust data pipelines for performance, reliability, and cost-effectiveness in a cloud-native environment
• Experience supporting data science workloads and working with both structured and unstructured data
• Experience working with both relational (e.g., PostgreSQL, MySQL) and NoSQL databases
• Experience with a big data processing framework (e.g., Spark)
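
As a rough illustration of the Python-plus-SQL pipeline work these requirements point to, the sketch below shows a tiny extract-transform-load flow. It is an assumption-laden example, not anything from the posting: `sqlite3` stands in for a real warehouse, and the `orders` table and its columns are made up for illustration.

```python
# Minimal ETL sketch for illustration only: sqlite3 stands in for a
# real data warehouse, and the table/column names are hypothetical.
import csv
import sqlite3

def run_pipeline(csv_path: str, db_path: str = "warehouse.db") -> None:
    # Extract: read raw rows from a CSV export.
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: normalise types and drop incomplete records.
    cleaned = [
        (r["order_id"], r["customer_id"], float(r["amount"]))
        for r in rows
        if r.get("order_id") and r.get("amount")
    ]

    # Load: upsert into the target table inside one transaction.
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS orders (
                   order_id TEXT PRIMARY KEY,
                   customer_id TEXT,
                   amount REAL
               )"""
        )
        conn.executemany(
            "INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", cleaned
        )
```

In a cloud-native setup the same shape of job would usually run under an orchestrator and write to a managed warehouse rather than a local database file.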

🏖️ Benefits

• Enhanced pension
• Life assurance
• Income protection
• Private healthcare
• Remote working - café, bedroom, beach - wherever works
• Truly flexible working hours - school pick up, volunteering, gym
• Generous leave - 27 days holiday plus bank holidays and enhanced family leave
• Annual bonus - when Satalia does well, we all do well
• Impactful projects - focus on bringing meaningful social and environmental change
• People-oriented culture - wellbeing is a priority, as is being a nice person
• Transparent and open culture - you will be heard
• Development - focus on bringing the best out of each other

Apply Now

Similar Jobs

November 14

Lead Data Engineer driving the development of a revenue intelligence platform for scale-ups at RocketX. Focus on building data platforms and pipelines in cloud environments.

Cloud, Python, SQL, Terraform, TypeScript

November 12

Data Engineer on the Business Intelligence and Data Warehousing Services team, designing and implementing data pipelines, resolving data issues, and managing large-scale data processing systems.

Azure, BigQuery, Cloud, ETL, Google Cloud Platform, Hadoop, Java, Kafka, MongoDB, MySQL, NoSQL, Python, Scala, Spark, SQL

November 8

Senior Data Engineer designing, building, and maintaining data architecture for efood. Collaborating with Product and Engineering teams to empower data-driven decision making.

AWS, Azure, Cloud, Docker, Kubernetes, Python, SQL

February 8

Join European Dynamics as a Data Engineer, designing databases and ETLs for international IT projects in a remote setup.

Cassandra, ETL, Java, MongoDB, MySQL, NoSQL, Oracle, Postgres, Python, SQL

January 25

Join as a Data Engineer designing big data solutions on a cloud platform, with a remote option.

Apache, AWS, Azure, Cloud, Docker, ETL, Google Cloud Platform, Python, Scala, Spark, SQL
