
Compliance • Healthcare Insurance • Education
Propelus is a company dedicated to modernizing the way professionals, employers, regulators, educators, and partners work together. It unifies three innovative brands—CE Broker, EverCheck, and Immuware—to provide comprehensive compliance and professional management solutions. Propelus empowers professionals with tools for career advancement, streamlines the compliance process, and offers a platform for professional management, continuing education, and primary source verification. Trusted by over 5 million professionals and organizations, Propelus has a mission to simplify compliance and enhance professional growth and success, particularly in the healthcare sector.
201 - 500 employees
Founded 2001
📋 Compliance
⚕️ Healthcare Insurance
📚 Education
October 31
Airflow
Amazon Redshift
AWS
Azure
BigQuery
Cloud
Docker
ETL
Hadoop
Java
Kafka
Kubernetes
MySQL
Oracle
Postgres
Python
Scala
Spark
SQL
• Design, develop, and maintain scalable and efficient data pipelines.
• Implement data integration solutions to combine data from various sources into a cohesive data warehouse.
• Develop ETL (Extract, Transform, Load) processes to transform raw data into useful formats for analysis (a minimal sketch follows this list).
• Ensure data quality and integrity by implementing data validation and cleaning processes.
• Optimize database performance through indexing, partitioning, and query optimization.
• Manage and maintain data storage solutions, including data warehouses and data lakes.
• Work closely with data scientists, analysts, and other stakeholders to understand data needs and requirements.
• Collaborate with software engineers to integrate data solutions into applications and systems.
• Create and maintain comprehensive documentation for data systems, processes, and workflows to ensure clarity and facilitate knowledge sharing.
• Provide guidance and mentorship to junior data engineers, lead projects, and contribute to the strategic direction of data engineering initiatives.
• Stay current with emerging technologies and industry trends to continuously improve data engineering practices.
• Identify opportunities to enhance data infrastructure and implement best practices for data management.
• Lead initiatives to automate and streamline data engineering processes.
• Ensure data security and compliance with relevant regulations and standards.
• Implement data governance policies to maintain data privacy and protection.
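To make the ETL and data validation responsibilities above concrete, here is a minimal, illustrative Python sketch. It assumes a hypothetical CSV source, placeholder column names (license_id, profession, issued_at), and a Postgres warehouse reached through SQLAlchemy; none of these names come from the posting.

```python
# Minimal ETL sketch: extract raw CSV records, validate and clean them, then
# load the result into a warehouse table. Paths, column names, and the
# connection URI are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine


def extract(path: str) -> pd.DataFrame:
    # Read raw source data; in practice this might be an API, S3 object, or DB dump.
    return pd.read_csv(path)


def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Basic validation and cleaning: normalize column names, drop rows missing
    # required keys, and coerce the date column to a proper datetime type.
    df.columns = [c.strip().lower() for c in df.columns]
    df = df.dropna(subset=["license_id", "profession"])
    df["issued_at"] = pd.to_datetime(df["issued_at"], errors="coerce")
    return df


def load(df: pd.DataFrame, table: str, conn_uri: str) -> None:
    # Append the cleaned batch to the warehouse table.
    engine = create_engine(conn_uri)
    df.to_sql(table, engine, if_exists="append", index=False)


if __name__ == "__main__":
    raw = extract("licenses.csv")
    clean = transform(raw)
    load(clean, "licenses_clean", "postgresql://user:pass@host:5432/warehouse")
```

In a production pipeline the same extract/transform/load steps would typically be split into separately testable tasks and scheduled by an orchestrator rather than run as a script.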
• Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
• 5+ years of experience in data engineering or a related role.
• Proficiency in SQL and experience with database management systems (e.g., MySQL, PostgreSQL, Oracle).
• Experience with big data technologies such as Hadoop, Spark, or Kafka.
• Experience using data orchestration tools such as Airflow, Dagster, or Prefect (see the DAG sketch after this list).
• Familiarity with cloud platforms (e.g., AWS, Google Cloud, Azure) and related services (e.g., Redshift, BigQuery).
• Strong programming skills in languages such as Python, Java, or Scala.
• Experience with data modeling, data warehousing, and building ETL pipelines.
• Knowledge of data governance, data security, and compliance practices.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills.
• Experience with real-time data processing and stream analytics.
• Advanced knowledge of Docker for containerizing applications and Kubernetes for orchestration; experience managing containerized data processing workloads.
• Knowledge of serverless architectures and services (e.g., AWS Lambda, Google Cloud Functions) and how to leverage them for data processing tasks.
• Knowledge of CI/CD pipelines and DevOps practices.
• Strong knowledge of data lineage tools to maintain transparency and traceability of data transformations.
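As a small illustration of the orchestration tools named above, the sketch below chains extract, transform, and load tasks with Airflow's TaskFlow API, assuming Airflow 2.4+ and Python 3.9+; the DAG name, schedule, and record fields are hypothetical, not part of the posting.

```python
# Minimal Airflow DAG sketch: a daily pipeline that chains extract, transform,
# and load steps. Task bodies and names are illustrative only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def license_pipeline():
    @task
    def extract() -> list[dict]:
        # Pull raw records from a source system (placeholder data here).
        return [{"license_id": "A123", "profession": "RN"}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Keep only records that pass a simple validation rule.
        return [r for r in records if r.get("license_id")]

    @task
    def load(records: list[dict]) -> None:
        # In a real pipeline this would write to Redshift, BigQuery, or Postgres.
        print(f"Loaded {len(records)} records")

    load(transform(extract()))


license_pipeline()
```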
• Awarded one of BuiltIn's 2025 Best Places to Work and honored as a Silver Stevie® Award Winner in the 2025 Stevie Awards For Great Employers.
• Professional development allowance to help you grow in the ways that mean the most to you.
• Flexibility for balancing work with the rest of life and ample PTO, including paid time off for volunteering, your birthday, and becoming a new parent.
• 401K with company matching, as well as financial planning education and resources.
• Employees can choose from HSA, FSA, and traditional insurance options for medical, dental, and vision coverage for themselves and dependents.
• Lifestyle Spending Account (LSA): We support personal well-being by offering an annual lifestyle spending account that you can use for what matters most to you, whether it’s a gym membership, a meditation app, WFH equipment, or fresh produce delivered to your door.
• Your health is our top priority! We cover 100% of your health insurance premiums. Our plans include national and international coverage, so you're protected no matter where you are.
• Propelus Flex Club: Our flexible benefits platform gives you monthly points to redeem on what you need most. Plus, you'll get access to exclusive discounts just for being part of our team.
• We've got you covered with a life insurance policy, paid 100% by the company. You can also add your beneficiaries at an exclusive, discounted rate.
October 29
Data Engineer at Ekumen maintaining and enhancing data ingestion pipelines with Java and Python. Collaborating with data scientists and engineers to improve data flows and observability.
October 28
Senior Data Engineer managing big data solutions with a focus on ETL processes. Designing data architectures and collaborating with teams while working remotely.
🗣️🇪🇸 Spanish Required
October 28
Data Architect designing and implementing data architectures in a remote setting. Focused on ETL processes and data analysis with collaboration across teams.
🗣️🇪🇸 Spanish Required
October 28
MLOps Engineer in Colombia designing and maintaining data pipelines for risk models. Automating training processes and ensuring observability and monitoring of data quality.
🗣️🇪🇸 Spanish Required
October 23
Senior Data Engineer developing ETL pipelines to improve healthcare technology. Collaborating on data solutions for Fortune 500 companies in a remote-first environment.