DataOps Engineer

3 days ago

Nagarro

B2B ‱ Enterprise ‱ Technology Consulting

Nagarro is a global leader in digital engineering and technology consulting. The company helps clients become innovative, digital-first businesses by leveraging technology to drive business breakthroughs. Known for its entrepreneurial agility and CARING mindset, Nagarro offers a wide range of services, including digital engineering, intelligent enterprise solutions, and experience and design services. With over 17,900 employees across 37 countries, Nagarro collaborates with industry leaders to accelerate digitalization and technology-led innovation.

10,000+ employees

Founded 1996

đŸ€ B2B

🏱 Enterprise

📋 Description

‱ Manage and support data pipelines, ETL processes, and analytics platforms, ensuring reliability, accuracy, and accessibility
‱ Execute data validation, quality checks, and performance tuning using SQL and Python/Shell scripting (an illustrative sketch follows this list)
‱ Implement monitoring and observability using Datadog, Grafana, and Prometheus to track system health and performance
‱ Collaborate with DevOps and infrastructure teams to integrate data deployments within CI/CD pipelines (Jenkins, Azure DevOps, Git)
‱ Apply infrastructure-as-code principles (Terraform, Ansible) for provisioning and automation of data environments
‱ Support incident and request management via ServiceNow, ensuring SLA adherence and root cause analysis
‱ Work closely with security and compliance teams to maintain data governance and protection standards
‱ Participate in Agile ceremonies within Scrum/Kanban models to align with cross-functional delivery squads
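
For illustration only (not part of the role description): a minimal sketch of the kind of null-rate quality check these duties involve. It assumes a hypothetical `orders` table and an invented 5% threshold, with Python's built-in sqlite3 module standing in for a real warehouse.

```python
"""Illustrative only: a minimal null-rate data-quality check.

Assumptions (not from the job posting): an in-memory SQLite table named
`orders` stands in for a real warehouse table, and the 5% threshold is invented.
"""
import sqlite3

NULL_RATE_THRESHOLD = 0.05  # hypothetical tolerance for missing customer_id values


def null_rate(conn: sqlite3.Connection, table: str, column: str) -> float:
    """Return the fraction of rows in `table` where `column` is NULL."""
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    nulls = conn.execute(f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL").fetchone()[0]
    return nulls / total if total else 0.0


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10), (2, None), (3, 11), (4, 12)])
    rate = null_rate(conn, "orders", "customer_id")
    print(f"customer_id null rate: {rate:.2%} -> {'OK' if rate <= NULL_RATE_THRESHOLD else 'ALERT'}")
```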

🎯 Requirements

‱ 8+ years in DataOps, Data Engineering Operations, or Analytics Platform Support, with good exposure to DevOps/SRE practices
‱ Proficiency in SQL and Python/Shell scripting for automation and data diagnostics
‱ Experience with cloud platforms (AWS mandatory; exposure to Azure/GCP a plus)
‱ Familiarity with CI/CD tools (Jenkins, Azure DevOps), version control (Git), and IaC frameworks (Terraform, Ansible)
‱ Working knowledge of monitoring tools (Datadog, Grafana, Prometheus)
‱ Understanding of containerization concepts (Docker, Kubernetes)
‱ Strong grasp of data governance, observability, and quality frameworks
‱ Experience in incident management and operational metrics tracking (MTTR, uptime, latency); a rough metrics sketch follows this list
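
For illustration only: a rough sketch of the MTTR and uptime tracking named in the last requirement, computed from invented incident timestamps standing in for records from a ticketing tool such as ServiceNow.

```python
"""Illustrative only: computing MTTR and uptime from incident records.

The incident timestamps below are invented; in practice they would come from a
ticketing tool such as ServiceNow.
"""
from datetime import datetime, timedelta

# Hypothetical incidents as (detected_at, resolved_at) pairs
INCIDENTS = [
    (datetime(2024, 11, 1, 9, 0), datetime(2024, 11, 1, 10, 30)),
    (datetime(2024, 11, 3, 14, 0), datetime(2024, 11, 3, 14, 45)),
    (datetime(2024, 11, 7, 22, 15), datetime(2024, 11, 8, 0, 15)),
]


def mean_time_to_repair(incidents):
    """MTTR = average of (resolved - detected) across incidents."""
    total = sum((resolved - detected for detected, resolved in incidents), timedelta())
    return total / len(incidents)


def uptime_percent(incidents, window):
    """Share of the reporting window not spent inside an open incident."""
    downtime = sum((resolved - detected for detected, resolved in incidents), timedelta())
    return 100.0 * (1 - downtime / window)


if __name__ == "__main__":
    print(f"MTTR: {mean_time_to_repair(INCIDENTS)}")
    print(f"Uptime over 30 days: {uptime_percent(INCIDENTS, timedelta(days=30)):.3f}%")
```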

đŸ–ïž Benefits

‱ Flexible work arrangements

Apply Now

Similar Jobs

November 20

Data Architect leading solutions using Azure and Databricks at a global software engineering company. Focused on data management and architecture for diverse clients in multiple sectors.

đŸ—ŁïžđŸ‡§đŸ‡·đŸ‡”đŸ‡č Portuguese Required

Azure ‱ SQL

November 19

Senior Data Engineer building robust data pipelines and transformations for an AI platform. Collaborating with CTO, CPO, and product teams to impact business outcomes.

BigQuery ‱ ETL ‱ Google Cloud Platform ‱ Postgres ‱ Python

November 19

Data Engineering Practice Lead responsible for Microsoft Fabric architecture and strategy. Leading data migrations while mentoring the engineering team towards modern data practices.

October 29

Data Engineer focused on improving e-commerce analytics and insights within the Merchant Intelligence Team. Using BI tools, SQL and Python for real-time data solutions at Constructor.

Docker ‱ Numpy ‱ Pandas ‱ PySpark ‱ Python ‱ Spark ‱ SQL

October 27

Leading Data Engineering at GoGlobal to build and scale data infrastructure for informed decision-making. Drive data quality and strategic partnerships, fostering an inclusive team culture.

Airflow ‱ AWS ‱ Azure ‱ BigQuery ‱ Cloud ‱ Distributed Systems ‱ ETL ‱ Google Cloud Platform ‱ Java ‱ Kafka ‱ NoSQL ‱ Python ‱ Scala ‱ Spark ‱ SQL ‱ Go
