
Healthcare Insurance • Fintech • Artificial Intelligence
Cotiviti is a healthcare technology and analytics company that specializes in improving payment accuracy and performance through advanced data analytics solutions. They partner with health plans, government agencies, and healthcare providers to deliver insights that enhance quality and efficiency in care delivery. With solutions such as risk adjustment, payment policy management, and member engagement, Cotiviti aims to optimize financial and clinical outcomes for the healthcare ecosystem.
November 14
🗣️🇪🇸 Spanish Required

• Design, build, and maintain efficient, scalable, and secure ETL pipelines to support data science and machine learning workloads
• Optimize data flow and collection processes for both batch and real-time systems
• Automate data ingestion, transformation, and integration workflows to support advanced analytics and ML pipelines
• Monitor and troubleshoot pipeline issues to ensure reliability and scalability
• Work with various databases, data warehouses, and cloud storage systems
• Implement best practices in database architecture, performance tuning, security, and cost efficiency
• Ensure the accuracy, consistency, and reliability of data across pipelines
• Collaborate with team members to identify, document, and resolve data issues
• Build and support pipelines for model retraining, performance tracking, and feature engineering
• Mentor junior engineers and promote a culture of engineering excellence and peer learning
• Bachelor’s degree in data engineering, big data, data analytics/science, computer science, or another quantitative field
• Minimum of 5 years of relevant experience
• Proven experience working with big data tools and platforms such as Spark, Hadoop, Oracle, or AWS S3
• Advanced proficiency in SQL, with experience in databases such as SQL Server, MySQL, or Oracle
• Strong understanding of data modeling for ML, including feature store management and serving strategies
• Hands-on experience building ETL pipelines and data warehousing solutions, and integrating analytic tools
• Experience with MLOps tools such as MLflow, Airflow, or Kubeflow is a plus
• Fully bilingual in English and Spanish (written and verbal)
• Competitive benefits package to address a wide range of personal and family needs.
November 7
Design and automate essential data pipelines ensuring seamless integration for Nielsen's analytics. Collaborate with teams and uphold data integrity throughout its lifecycle.
Airflow
AWS
Cloud
EC2
PySpark
Python
SDLC
SQL
Tableau
November 6
GCP Data Engineer in a remote role for a leading data solutions company. Responsible for implementing data architectures and ETL processes with a focus on Google Cloud Platform.
🗣️🇪🇸 Spanish Required
Apache
BigQuery
Cloud
ETL
Google Cloud Platform
PySpark
Python
Spark
SQL
November 4
Data Engineer developing and maintaining data pipelines for a global agile consultancy, working with a Modern Data Stack centered on Snowflake and Azure Data Factory.
🇲🇽 Mexico – Remote
💵 $50k - $65k / month
💰 Post-IPO Equity on 2007-03
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
🗣️🇪🇸 Spanish Required
Azure
ETL
ITSM
Python
SDLC
ServiceNow
SQL
November 4
Data Engineer at Inetum designing and operating petabyte-scale data systems. Building real-time data pipelines and collaborating in agile environments to deliver scalable solutions.
🗣️🇪🇸 Spanish Required
Airflow
AWS
Azure
Cassandra
Google Cloud Platform
Hadoop
HBase
Java
Kafka
Oracle
Python
Spark
SQL
November 1
Data Engineer focusing on operating large-scale data systems and real-time data pipelines at Inetum. Collaborating with engineers and product managers to build robust technical solutions.
🗣️🇪🇸 Spanish Required
Airflow
Azure
Cassandra
Cloud
Distributed Systems
Google Cloud Platform
Hadoop
HBase
Java
Kafka
Oracle
Python
Spark
SQL