Data Engineer – Apache Airflow Specialist


November 4


Intetics

Artificial Intelligence • Software Development • Enterprise

Intetics is a custom software development company specializing in AI and machine learning solutions. It offers Remote In-Sourcing® to build expert teams for software engineering and data-processing projects, along with tools such as TETRA™ for software quality assessment. Intetics aims to empower businesses by leveraging high-quality data and integrating modern technologies across industries including healthcare, finance, and more.

501 - 1000 employees

Founded 1995


📋 Description

• Develop, orchestrate, and maintain complex Apache Airflow DAGs for ETL and data-processing pipelines.
• Build and optimize Python-based ETL scripts, integrating with Flask APIs when needed.
• Design and manage Elasticsearch indexing and performance-tuning workflows.
• Handle Unix/Linux scripting and operations for automation and monitoring.
• Work with Oracle and PostgreSQL databases for large-scale data processing.
• Implement and maintain GitLab CI/CD pipelines for build, test, and deploy stages.
• Collaborate with the project team to ensure scalability, reliability, and quality of data solutions.
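As a rough illustration of the Python ETL work described above (not taken from the posting — the table name, fields, and sample records are hypothetical), an extract/transform/load step might be sketched like this, using only the standard library:

```python
import sqlite3

def extract(rows):
    """Extract step: yield raw records from a source (hypothetical in-memory list here)."""
    yield from rows

def transform(record):
    """Transform step: normalize field names, trim strings, and cast types."""
    return (record["id"], record["name"].strip().lower(), float(record["amount"]))

def load(conn, records):
    """Load step: idempotently upsert transformed rows into a target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO sales VALUES (?, ?, ?)", records)
    conn.commit()

if __name__ == "__main__":
    source = [
        {"id": 1, "name": " Alice ", "amount": "10.5"},
        {"id": 2, "name": "Bob", "amount": "3"},
    ]
    conn = sqlite3.connect(":memory:")
    load(conn, (transform(r) for r in extract(source)))
    print(conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0])
```

In an Airflow deployment each step would typically become its own task in a DAG, with the scheduler handling retries and dependencies between extract, transform, and load.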

🎯 Requirements

• ≥ 3 years of Apache Airflow DAG orchestration.
• ≥ 5 years of Python (ETL focus); Flask API experience is a plus.
• ≥ 3 years of Elasticsearch (data indexing and optimization).
• ≥ 3 years of Unix/Linux scripting and operations.
• ≥ 3 years with Oracle or PostgreSQL (ideally both).
• ≥ 3 years of GitLab pipelines (build/test/deploy).
• Advanced English and a technical degree.


Similar Jobs

October 22

Data Engineer leading a small team to modernize a data platform into a cloud-based analytics project at Sigma Software.

Azure • ETL • Python • RDBMS • Spark • SQL

October 12

Senior Data Engineer designing and optimizing data solutions for analytics and AI at Valtech. Collaborating with teams on batch and streaming data pipelines in a dynamic environment.

Airflow • Apache • AWS • Azure • Google Cloud Platform • Python • Spark • SQL

October 12

Lead Data Engineer at Valtech designing and optimizing large-scale data solutions. Collaborating across technologies like Databricks, AWS, and GCP.

Airflow • Apache • AWS • Azure • Cloud • Google Cloud Platform • Python • Spark • SQL • Terraform • Unity

October 7

Senior Data Engineer developing and maintaining big data pipelines at Stellar for AdTech solutions. Collaborating with cross-functional teams on data architecture and governance.

Apache • AWS • Azure • Cloud • Google Cloud Platform • Hadoop • Kafka • NoSQL • PySpark • Python • Spark • SQL

September 11

Data Engineer building and optimizing ETL/ELT pipelines on GCP for SpinLab's slot-gaming analytics. Focus on scalability, FinOps, and productionizing ML workflows.

Airflow • AWS • BigQuery • Cloud • Docker • ETL • Google Cloud Platform • Kubernetes • Linux • Python • SQL • Terraform
