
Fintech • Insurance • SaaS
Bestow is a leading life insurance SaaS platform provider that offers innovative technology solutions for life and annuity insurance carriers. By providing tools such as digital applications, customizable underwriting engines, and comprehensive customer portals, Bestow helps streamline the entire insurance process from quoting to policy administration. The company focuses on enhancing the speed and efficiency of insurance operations, enabling carriers to launch new products swiftly and optimize their customer service capabilities.
51 - 200 employees
Founded 2017
💳 Fintech
☁️ SaaS
October 28
Airflow
Amazon Redshift
Apache
AWS
Azure
BigQuery
Cloud
Docker
Google Cloud Platform
GraphQL
gRPC
Python
SQL
Terraform

• Define and drive the technical roadmap for data infrastructure, establishing architectural patterns and standards that scale across the organization
• Lead the design and implementation of complex, multi-system data architectures that support business-critical operations and enable innovation (data ingestion plus export and delivery)
• Evaluate and champion the adoption of emerging technologies and best practices in data engineering, MLOps, and GenAI
• Establish data governance frameworks, quality standards, and operational excellence practices across all data workloads
• Drive cross-functional initiatives that require coordination between data, product, engineering, and business teams
• Architect enterprise-scale data solutions for transferring data between first- and third-party applications and our data warehouse
• Design and oversee the development of robust, scalable APIs (REST, GraphQL, gRPC) that enable data access for internal teams and external partners
• Lead the evolution of event-driven and API-first data architectures that support real-time data sharing and integration
• Leverage Google Cloud Platform (GCP) tools (Cloud Run, Cloud Functions, Vertex AI, App Engine, Cloud Storage, IAM, etc.) and services (Astronomer-managed Apache Airflow) to architect and bring enterprise data workloads to production (a minimal illustration follows this list)
• Design resilient, self-healing data systems with comprehensive monitoring, alerting, and automated remediation, and participate in an on-call rotation
• Lead the evolution of our data platform on GCP, leveraging advanced services and optimizing for cost, performance, and reliability
• Define patterns for streaming and batch data architectures that serve diverse use cases
• Establish best practices for data contracts, API versioning, CI/CD, documentation, and partner integrations
• Lead MLOps strategy and implementation, establishing patterns for model deployment, monitoring, and governance at scale
• Architect and oversee Generative AI infrastructure, enabling rapid prototyping while ensuring enterprise-grade security, compliance, and cost management
• Partner with Data Science leadership to translate research initiatives into production-ready solutions
• Drive innovation in AI/ML tooling and infrastructure, staying ahead of industry trends
• Mentor and guide Data Engineers at all levels, conducting design reviews and providing technical feedback
• Establish engineering standards, documentation practices, and knowledge-sharing processes
• Participate in hiring and onboarding, helping to build a world-class data engineering team
• Foster a culture of engineering excellence, experimentation, and continuous improvement
• Partner with product, engineering, and business leaders to align data strategy with organizational goals
• Communicate complex technical concepts to non-technical stakeholders, building alignment and driving informed decision-making
• Represent data engineering in cross-functional planning and architecture forums
• Build strong relationships with external partners and vendors
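To make the GCP/Airflow responsibility above concrete, here is a minimal sketch of the kind of batch ingestion workload described: an Airflow DAG that loads daily partner files from Cloud Storage into BigQuery. The DAG id, bucket, dataset, and table names are hypothetical placeholders, not Bestow's actual systems.

# A minimal sketch, assuming Airflow 2.4+ with the Google provider installed.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="partner_feed_ingest",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
    tags=["ingestion", "partner-feeds"],
) as dag:
    # Load the day's partner files from a GCS bucket into a raw BigQuery table.
    load_feed = GCSToBigQueryOperator(
        task_id="load_partner_feed",
        bucket="example-partner-feeds",           # hypothetical bucket
        source_objects=["feeds/{{ ds }}/*.csv"],  # daily-partitioned files
        destination_project_dataset_table="example_project.raw.partner_feed",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

In practice a production DAG like this would also carry the monitoring, alerting, and retry policies the role calls for; the sketch shows only the core load step.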
• 10+ years in a data engineering role supporting incoming/outgoing data feeds as well as analytics and data science teams
• 5+ years of advanced Airflow and Python experience writing production-grade, efficient, testable, and maintainable code
• 3+ years of experience designing, building, and maintaining production APIs (REST, GraphQL, gRPC) for data access and integration, including API gateway management, rate limiting, authentication/authorization, and versioning strategies
• 3+ years leading ML/MLOps initiatives, including model deployment, monitoring, and governance at scale
• 3+ years of hands-on experience with Google Cloud Platform (GCP), including Cloud Run, Cloud Functions, Vertex AI, Cloud Storage, IAM, and other core services
• Deep expertise with columnar databases (BigQuery, Snowflake, Redshift) and advanced SQL optimization techniques
• Demonstrated experience with AI coding assistants; AI tools are deeply ingrained in Bestow's culture
• Proven track record of designing end-to-end data pipelines in cloud frameworks (such as GCP, AWS, or Azure) with requirements from multiple stakeholders
• Experience with upstream data coordination through data contracts (see the sketch after this list)
• Experience building CI/CD pipelines for data processing using tools such as Docker, CircleCI, dbt, and git
• Extensive experience with infrastructure as code (Terraform, Pulumi) and GitOps practices
• Expert-level knowledge of data orchestration frameworks such as Apache Airflow (or similar) for managing SLOs and processing dependencies
• Experience building streaming/real-time ingestion pipelines
• Experience creating alerts and monitoring pipelines that contribute to overall data governance
• Experience with containerization and container orchestration technologies, including cloud architecture and implementation features (single- and multi-tenancy, orchestration, elastic scalability)
• Deep understanding of standard IT security practices such as identity and access management (IAM), data protection, encryption, and certificate and key management
• Adaptability to learn new technologies and products as the job demands
• Proven ability to mentor engineers and lead technical initiatives across teams
• Nice to have: familiarity with building tools that draw on enterprise-grade Generative AI (GenAI) integrations (not simply vibe-coded)
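As an illustration of the data-contracts requirement above, here is a minimal Python sketch of contract-style validation: upstream records are checked against an explicit, versioned schema and rejected before they can land in the warehouse. The contract and field names are hypothetical.

# A minimal sketch of a versioned data contract; all names are hypothetical.
from dataclasses import dataclass


@dataclass(frozen=True)
class PolicyEventContractV1:
    """Contract v1 for a hypothetical upstream policy-event feed."""
    policy_id: str
    event_type: str
    occurred_at: str  # ISO-8601 timestamp


REQUIRED_FIELDS = {"policy_id", "event_type", "occurred_at"}


def validate_record(record: dict) -> PolicyEventContractV1:
    """Reject records that violate the contract instead of loading bad data."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"contract violation, missing fields: {sorted(missing)}")
    return PolicyEventContractV1(
        policy_id=str(record["policy_id"]),
        event_type=str(record["event_type"]),
        occurred_at=str(record["occurred_at"]),
    )

Breaking changes to the feed would ship as a new contract version (e.g. a V2 class) rather than silently mutating V1, which is the coordination discipline the bullet refers to.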
• Competitive salary and equity based on role
• Policies and managers that support work/life balance, like our flexible paid time off and parental leave programs
• 100% paid-premium option for medical, dental, and vision insurance
• Lifestyle stipend to support your physical, emotional, and financial wellbeing
• Flexible work-from-home policy, open to fully remote work
• Remote and WFH options, as well as a beautiful, state-of-the-art office in Dallas’ Deep Ellum for those who prefer an office setting
• Employee-led diversity, equity, and inclusion initiatives
October 28
Clinical Data Architect managing data architecture and clinical data integration for healthcare solutions. Collaborating with teams to ensure integrity and quality of ophthalmology data.
🇺🇸 United States – Remote
💵 $141k - $235k / year
⏰ Full Time
🟠 Senior
🔴 Lead
🚰 Data Engineer
🦅 H1B Visa Sponsor
October 26
Director of Data Engineering managing Workiva’s multi-tenant data platform in the US and Europe. Driving strategy and execution of internal and external data products with measurable customer value.
October 24
Principal Data Engineer specializing in database architecture and optimization for insightsoftware. Leading modernization efforts for database systems to enhance performance and integration.
🇺🇸 United States – Remote
💰 Private Equity Round on 2021-07
⏰ Full Time
🔴 Lead
🚰 Data Engineer
🦅 H1B Visa Sponsor
October 22
Principal Data Engineer at Trella Health designing complex data models and pipelines for healthcare products. Collaborating closely with Data Scientists and mentoring team members for quality assurance.
October 22
Manager of Data Engineering leading a team for DexCare, a digital care orchestration platform. Building scalable data pipelines and ensuring data quality in a regulated healthcare environment.
🇺🇸 United States – Remote
💵 $143k - $193k / year
💰 $50M Series B on 2022-01
⏰ Full Time
🟠 Senior
🔴 Lead
🚰 Data Engineer
🦅 H1B Visa Sponsor