Data Engineer

Job not on LinkedIn

September 24


Prominence Advisors

Healthcare Insurance • Artificial Intelligence • Enterprise

Prominence Advisors is a data enablement company that has been empowering health systems across the United States with self-service analytics since 2011. They integrate data from multiple sources in real-time, enabling large healthcare providers to maximize the value of their data. Prominence helps healthcare organizations leverage their data to make healthcare smarter, improve patient outcomes, reduce the cost of care, and increase operational efficiencies. Their solutions have saved clients millions of dollars and empowered healthcare leaders by building trust in their data through governed self-service analytics.

📋 Description

• Design, build, and maintain data pipelines and workflows
• Transform raw sources into clean, reliable, and scalable data streams
• Build and optimize data pipelines that ingest, transform, and deliver data from diverse sources (EHRs, claims, APIs, CRM) into analytics-ready structures
• Guide customers through complex data challenges and partner with healthcare providers
• Deliver high-quality results and mentor counterparts to support and expand deliverables
• Work in a fully remote team environment focused on healthcare IT strategy and analytics

🎯 Requirements

• 2–5+ years of professional experience in data engineering or related roles
• Strong SQL skills, including query optimization and debugging
• Proficiency in Python (or another programming/scripting language such as Scala or Java)
• Hands-on experience with at least one of: Snowflake; Databricks; Azure Data Factory; AWS Redshift; Google BigQuery; dbt or similar transformation tools; Apache Airflow or other orchestration frameworks
• Familiarity with ETL/ELT principles, data warehousing, and data modeling concepts
• Experience with cloud services (AWS, Azure, or GCP)
• Healthcare industry knowledge and experience (Epic, HL7, FHIR, claims) (desired)
• Experience with CI/CD pipelines, Git, and DevOps workflows (desired)
• Familiarity with Infrastructure-as-Code tools (Terraform, CloudFormation) (desired)
• Experience with real-time/streaming data tools (Kafka, Kinesis, Pub/Sub) (desired)
• Containerization experience (Docker, Kubernetes) (desired)
• Cloud or data tool certifications (desired)
• Full-time salaried role; no relocation required; candidates must have a suitable home office to work from

🏖️ Benefits

• Our two lines of business, Analytics and Epic Services, offer a diversified career path, stability in a rapidly changing market, and opportunities for growth within Prominence.
• Prominence is a fully remote company, with no requirements on where you live or work within the US and flexibility to manage your own schedule.
• Full-time staff receive 15 days of PTO and up to 16 paid holidays each year.
• We offer a diverse healthcare package, including low- and high-deductible health plans, HSAs, LTD/STD insurance, Health and Dependent Care Savings Accounts, Vision, Dental, a 401k, an annual Professional Development fund, and signing bonuses.


Similar Jobs

September 24

Lead and manage 5–8 data engineers (50–60% hands-on coding), driving architecture, hiring, and delivery at Velir, a remote digital agency

Tags: Airflow, Apache, Azure, Cloud

September 24

Data Engineer at Mission Lane building scalable ETL/ELT pipelines with Python, dbt, and Snowflake on GCP. Collaborates with engineers, analysts, and data scientists to ensure reliable analytics.

Tags: Airflow, Cloud, ETL, Google Cloud Platform, Kubernetes, Python, SQL, Terraform

September 20

Build scalable ETL/ELT pipelines and data models for BI at Fusion Connect, ensuring data quality and cloud integration.

Tags: AWS, Azure, Cloud, ERP, ETL, Google Cloud Platform, Python, SQL, Tableau

September 20

Data Engineer building and maintaining Python/Spark pipelines and data quality for Veeva's OpenData reference datasets in the life sciences sector

Tags: Airflow, AWS, Cloud, Java, Python, Spark, SQL

September 17

Lead secure AWS IaC, CI/CD, containerization, and data pipelines for Via Logic serving DHS and other federal clients.

Tags: Ansible, AWS, Chef, Cloud, Cyber Security, Docker, DynamoDB, EC2, ETL, Java, Jenkins, Kubernetes, Microservices, Oracle, Postgres, Python, Redis, SQL, Tableau, Terraform
