Data Ops Engineer

Job not on LinkedIn

4 hours ago

Apply Now

McKesson

Healthcare Insurance • Pharmaceuticals • Biotechnology

McKesson is a diversified healthcare leader specializing in pharmaceutical distribution, medical supplies, and healthcare services. Its solutions facilitate patients' access to life-changing therapies, support efficient operations for pharmacies, health systems, and clinics, and address critical issues such as drug shortages through a resilient supply chain. McKesson provides comprehensive services including pharmacy management software, consulting, and technology solutions for specialty practices, with a focus on improving health outcomes and advancing the pharmaceutical industry. It works closely with biopharma companies to enhance medication access, adherence, and commercialization while supporting oncology and biopharma practices through data-driven insights and real-world evidence.

📋 Description

• Design and implement scalable and efficient data pipelines to support various data-driven initiatives (see the illustrative sketch after this list).
• Collaborate with cross-functional teams to understand data requirements and contribute to the development of data architecture.
• Work on data integration projects, ensuring seamless and optimized data flow between systems.
• Implement best practices for data engineering, ensuring data quality, reliability, and performance.
• Contribute to data modernization efforts by leveraging cloud solutions and optimizing data processing workflows.
• Create and maintain technical documentation, including data mapping documents and data dictionaries.
• Effectively communicate technical concepts to both technical and non-technical stakeholders.
• Automate deployments and promotions across environments using GitHub CI/CD with GitHub Actions and Liquibase.
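The posting itself contains no code; as a rough illustration of the pipeline work described above, here is a minimal PySpark sketch of a batch extract-transform-load step with a simple data-quality gate. The storage paths and the claim_id, claim_amount, and claim_date column names are hypothetical placeholders, not details from the listing.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_daily_load").getOrCreate()

# Extract: read raw CSV files landed in an ADLS Gen2 container (hypothetical path).
raw = spark.read.option("header", True).csv(
    "abfss://raw@exampleaccount.dfs.core.windows.net/claims/"
)

# Transform: cast types, drop rows missing the key, and deduplicate (simple quality gate).
clean = (
    raw.withColumn("claim_amount", F.col("claim_amount").cast("double"))
       .filter(F.col("claim_id").isNotNull())
       .dropDuplicates(["claim_id"])
)

# Load: write partitioned Parquet for downstream consumers.
clean.write.mode("overwrite").partitionBy("claim_date").parquet(
    "abfss://curated@exampleaccount.dfs.core.windows.net/claims/"
)

spark.stop()
```

In a Databricks or Data Factory setting, a step like this would typically run as a scheduled job, with the GitHub Actions and Liquibase tooling mentioned above handling promotion of job code and database changes between environments.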

🎯 Requirements

• 5+ years of experience in data engineering.
• Bachelor's degree in a related field (e.g., Computer Science, Information Technology, Data Science).
• Technical expertise in building and optimizing data pipelines and large-scale processing systems.
• Technical expertise with Azure Cloud, Data Factory, Batch Service, and Databricks.
• Experience with cloud solutions and contributing to data modernization efforts.
• Experience using Terraform and Bicep scripts to build Azure infrastructure.
• Experience implementing security changes using Azure RBAC.
• Experience building cloud infrastructure including Data Factory, Batch Service, Azure Data Lake Storage Gen2 accounts, and Azure SQL Database (see the sketch after this list).
• Experience developing data pipelines in languages such as SQL, Python, PySpark, or Scala.
• Strong problem-solving skills and attention to detail.
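As a hedged illustration of the storage and RBAC points above, the sketch below lands an extract in an Azure storage account using the Azure SDK for Python, authenticating with DefaultAzureCredential (which relies on an Azure AD identity holding an RBAC role such as Storage Blob Data Contributor). The account, container, and file names are hypothetical.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# DefaultAzureCredential picks up a managed identity, service principal, or developer
# login; access is governed by Azure RBAC role assignments on the storage account.
credential = DefaultAzureCredential()

# Hypothetical account and container names.
service = BlobServiceClient(
    account_url="https://exampleaccount.blob.core.windows.net",
    credential=credential,
)
container = service.get_container_client("raw")

# Land a local extract in the container for downstream pipeline steps.
with open("claims_extract.csv", "rb") as data:
    container.upload_blob(name="landing/claims_extract.csv", data=data, overwrite=True)
```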

🏖️ Benefits

• Comprehensive benefits to support physical, mental, and financial well-being.
• Total Rewards package includes healthcare and wellness programs.

Apply Now

Similar Jobs

5 hours ago

Senior Specialist of Self-Pay Operations managing collections and improving billing strategies for a leading revenue cycle management provider. Responsibilities include overseeing accounts receivable and analyzing performance.

🇺🇸 United States – Remote

💵 $17 - $19 / hour

💰 Private Equity Round on 2022-03

⏰ Full Time

🟠 Senior

⚙️ Operations

5 hours ago

Associate Director overseeing day-to-day operations of Celcuity's field teams across Sales, Market Access, Marketing, and Medical Affairs. Implementing and managing CRM systems and field enablement programs.

6 hours ago

Operations Coordinator responsible for HR, benefits, and finance operations at a nonprofit. Collaborating with the COO to maintain organizational functions and streamline processes across departments.

8 hours ago

Lead Operations Engineer focusing on troubleshooting VoIP issues and supporting operational excellence for Brightspeed. Ensuring the reliability of voice services and collaborating with various teams.

8 hours ago

Associate Manager at DoorDash implementing fraud prevention strategies and customer incentives. Collaborating across teams for business impact and leveraging data-driven decision-making.