
Healthcare Insurance • Fintech • Artificial Intelligence
Cotiviti is a healthcare technology and analytics company that specializes in improving payment accuracy and performance through advanced data analytics solutions. They partner with health plans, government agencies, and healthcare providers to deliver insights that enhance quality and efficiency in care delivery. With solutions such as risk adjustment, payment policy management, and member engagement, Cotiviti aims to optimize financial and clinical outcomes for the healthcare ecosystem.
5001 - 10000 employees
⚕️ Healthcare Insurance
💳 Fintech
🤖 Artificial Intelligence
October 4

• Implement data storage solutions and architectures for AI/ML workflows.
• Develop integration patterns between various data platforms and systems.
• Build scalable data processing pipelines using distributed computing frameworks.
• Create data quality and validation frameworks for ML pipelines (see the sketch after this list).
• Design and implement ETL processes for model training data preparation.
• Support performance monitoring and optimization of data infrastructure.
• Implement security configurations and access controls for data platforms.
• Develop data lineage tracking and metadata management solutions.
• Build event streaming integrations for real-time data processing.
• Create automated testing frameworks for data pipeline validation.
• Document technical implementations and integration patterns.
• Collaborate with data scientists on data access and processing requirements.
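As an illustration of the data quality and validation work listed above, here is a minimal sketch in Python (one of the languages the role calls for). The pandas-based checks, column names, and thresholds are illustrative assumptions, not Cotiviti's actual framework.

```python
# Minimal data-quality gate for ML training data.
# Assumes a pandas DataFrame; names and thresholds are illustrative only.
import pandas as pd


def validate_training_data(df: pd.DataFrame,
                           required_columns: list[str],
                           max_null_fraction: float = 0.05) -> list[str]:
    """Return a list of data-quality issues; an empty list means the frame passes."""
    issues = []

    # Every column the model expects must be present.
    missing = [c for c in required_columns if c not in df.columns]
    if missing:
        issues.append(f"missing columns: {missing}")

    # No column may exceed the allowed fraction of null values.
    for col in df.columns:
        null_fraction = df[col].isna().mean()
        if null_fraction > max_null_fraction:
            issues.append(f"{col}: {null_fraction:.1%} nulls (limit {max_null_fraction:.0%})")

    # Duplicate rows can leak across train/test splits, so flag them.
    duplicates = int(df.duplicated().sum())
    if duplicates:
        issues.append(f"{duplicates} duplicate rows")

    return issues
```

In practice a validation step like this would run before model training and fail the job, or quarantine the offending records, whenever it returns any issues.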
• Bachelor’s degree in computer science or related field.
• 4+ years of software engineering experience with a data platform focus.
• Strong knowledge of database systems and data storage architectures.
• Experience with distributed data processing frameworks.
• Proficiency in data pipeline design and ETL development.
• Understanding of event streaming systems and real-time processing.
• Experience with bulk data manipulation and optimization techniques.
• Strong programming skills in Python, Java, or Scala.
• Knowledge of data security and access control patterns.
• Experience with cloud data platforms and services.
• Understanding of data quality and validation methodologies.
• Competitive salary
• Flexible working hours
• Professional development budget
• Home office setup allowance
• Global team events
October 2
Senior Software Engineer developing scalable microservices in Python and Golang at OutSystems. Collaborating on AWS environments and Kubernetes workloads, and ensuring secure development practices.
September 30
Lead and optimize SQL Server ETL and Azure data pipelines for Veradigm's VEHR. Design scalable analytics, manage 3,000+ databases, and support compliance reporting.
September 30
Build and optimize Sumo Logic's query-engine microservices processing petabytes of logs; own services, implement algorithms, practice TDD, and take on-call responsibilities.
September 29
Software Development Engineer II building full-stack growth experiments, retention and monetization features for QuillBot within Learneo's platform.
September 28
51 - 200 employees
Backend engineer expanding query capabilities and ingestion for Imply Lumi observability warehouse. Focus on query performance, scalability, and reliability.