
SaaS • Healthcare Insurance
Arbiter is a care orchestration platform that unifies patients, payers, and providers on a single intelligent system to automate referrals, prior authorizations, scheduling, and care optimization. The modular SaaS embeds into existing clinical workflows, consolidates clinical, financial, and policy data, and uses AI-driven routing and automation to improve referral matching, reduce administrative burden, and ensure timely, cost-effective care. Arbiter is built by industry operators and integrates with EMRs and payer systems to deliver closed-loop execution across the care continuum.
November 25

• Architect & Build: Design, develop, and maintain robust, scalable, high-performance data processing systems (batch and real-time streaming) that power critical business functions, AI agents, and advanced analytics.
• Technical Leadership: Lead complex data engineering initiatives from conception to deployment, ensuring data pipelines are reliable, efficient, testable, and maintainable, and adhere to best practices for ingesting data from EMRs, claims, payer files, and payer policies.
• Data Modeling & Governance: Drive the design of our enterprise data models for optimal storage, retrieval, and analytical performance, ensuring alignment with product, business, and regulatory requirements, including tracking RAF performance and HCC code detection.
• Platform & Tooling: Champion and contribute to core data platform tooling, frameworks, and standards that enhance developer productivity and data quality across the organization, supporting our AI agents and auditable systems.
• Cross-Functional Collaboration: Partner closely with product managers, data scientists, software engineers, and non-technical stakeholders to understand data needs, deliver impactful solutions, and provide expert data insights that drive the intelligent operating system.
• Mentorship & Growth: Mentor junior data engineers, contributing to the team's growth through technical guidance, code reviews, and knowledge sharing.
• Hiring & Onboarding: Play an active role in interviewing and onboarding new team members, helping to build a world-class data engineering organization.
• 8+ years of deep, hands-on experience in data engineering, data infrastructure, or building data engineering tools and frameworks, ideally in a high-growth tech environment.
• Exceptional expertise in data structures, algorithms, and distributed systems.
• Mastery of Python for large-scale data processing; experience with other languages such as Java or Scala is a plus.
• Extensive experience designing, building, and optimizing complex, fault-tolerant data pipelines (both batch and real-time streaming).
• Deep, hands-on experience with cloud-native data platforms, especially Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Composer, and Dataproc.
• Demonstrated experience with modern data orchestration (e.g., Airflow), data transformation (e.g., dbt), and data warehousing concepts.
• Strong working knowledge of unit, integration, and functional testing strategies, and the ability to implement them.
• Experience providing technical leadership and guidance, and thinking strategically and analytically to solve problems.
• Strong communication skills and the ability to work well in a diverse team setting.
• Demonstrated experience working with many cross-functional partners.
• Demonstrated experience leading the vision and delivery plan for a software product or component.
• Highly Competitive Salary & Equity Package: Designed to rival top FAANG compensation, including meaningful equity.
• Generous Paid Time Off (PTO): To ensure a healthy work-life balance.
• Comprehensive Health, Vision, and Dental Insurance: Robust coverage for you and your family.
• Life and Disability Insurance: Providing financial security.
• SIMPLE IRA Matching: To support your long-term financial goals.
• Professional Development Budget: Support for conferences, courses, and certifications to fuel your continuous learning.
• Wellness Programs: Initiatives to support your physical and mental health.
November 25
Senior Data Engineer building data pipelines that process billions of events for Button's mobile growth platform. Collaborating with data science, ML, and infrastructure teams on robust data solutions.
Airflow
Apache
AWS
BigQuery
Cloud
DynamoDB
Google Cloud Platform
MySQL
Postgres
Python
Redis
SQL
Terraform
November 25
Senior Data Engineer implementing and monitoring data pipelines. The company offers disruptive technology solutions focused on Robotic Process Automation, Artificial Intelligence, and Analytics.
🗣️🇧🇷🇵🇹 Portuguese Required
Airflow
Cloud
ETL
Google Cloud Platform
NoSQL
Python
Spark
SQL
November 25
Software Engineer building scalable data products and APIs for Demandbase's B2B data platform. Collaborating with teams to enhance data-driven products.
🇺🇸 United States – Remote
💵 $151k - $227k / year
⏰ Full Time
💡 Mid-level
Senior
💰 Data Engineer
AWS
BigQuery
Cloud
ETL
Google Cloud Platform
Java
Kafka
Postgres
Python
React
Scala
TypeScript
Go
November 25
Senior Data Engineer improving real-world entity identification datasets for Demandbase's account-based GTM strategies. Leading initiatives and collaborating on machine learning model development within a dynamic engineering team.
Airflow
Apache
AWS
EC2
Java
Scala
SDLC
Spark
SQL
November 25
Senior Data Engineer at People Data Labs building scalable data solutions and infrastructure using modern data tools and technologies.
Airflow
Amazon Redshift
Apache
AWS
Azure
BigQuery
Cloud
Google Cloud Platform
Java
Python
Scala
Spark
SQL