
August 28

Artificial Intelligence • Finance • Fintech
Saaf Finance is a company that specializes in automating loan origination using AI and advanced technologies. Their platform integrates with partners' data to provide a seamless digital experience for consumers and lenders, focusing on efficiency, accuracy, and security. Saaf's solutions are tailored for mortgage products including Home Equity Loans and First Mortgages, providing customizable underwriting, compliance, and integrated services. The company aims to enhance revenue growth, operational efficiency, and competitive edge for lenders while offering enhanced security and digital packaging for the secondary market.
Saaf Finance is building the AI workforce for the mortgage industry, reimagining how loans are underwritten and processed. Backed by leading financial institutions, we’re scaling fast and looking for a Data Engineer who thrives on solving complex data challenges and wants to shape the infrastructure behind one of the most data-intensive industries in the world.

As a Data Engineer at Saaf, you’ll own the backbone of our AI-driven platform: the data. From borrower interactions to loan performance metrics, every workflow we build depends on data being accurate, reliable, and available in real time. This is a hands-on role where you’ll work closely with engineering, product, and data teams to design and operate production-grade infrastructure that powers analytics, product features, and next-generation agentic automation in mortgage origination.

This role is perfect for someone who enjoys building from scratch, wants to push the boundaries of how data pipelines can support AI systems, and is excited to reimagine financial workflows at scale.

Key Responsibilities
• Data Pipeline Development – Design, implement, and maintain ETL/ELT pipelines for structured and unstructured datasets from internal and external sources (a sketch follows this list).
• Data Warehousing – Build and optimize warehouses and marts (Snowflake, BigQuery, or similar) for analytics, reporting, and product use cases.
• Integration – Ingest data from APIs, SaaS platforms such as CRM and financial data APIs, and internal systems into the core data platform.
• Data Modeling – Design, implement, and maintain conceptual, logical, and physical data models to ensure scalable, consistent, and high-quality datasets for downstream analytics and applications.
• Data Quality and Governance – Implement validation, schema management, and robust documentation to ensure data accuracy and compliance.
• Performance Optimization – Monitor and fine-tune pipeline and warehouse performance for scalability and cost efficiency.
• Security and Compliance – Apply data security and privacy controls aligned with financial regulatory requirements, ensuring full traceability of every transformation.
• Analytics Enablement – Provide clean, consistent datasets for analysts, product managers, and operational teams to support fast, data-driven decisions.
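For illustration only (not Saaf’s actual stack): a minimal Python extract-validate-load step of the kind the pipeline and data-quality bullets describe. The API endpoint, field names, and table are hypothetical, and sqlite3 stands in for a warehouse such as Snowflake.

```python
"""Minimal ETL sketch: pull loan records from an API, validate, load.

Illustrative only -- endpoint, fields, and table are hypothetical, and
sqlite3 stands in for a real warehouse (e.g. Snowflake).
"""
import sqlite3

import requests

REQUIRED_FIELDS = {"loan_id", "borrower_id", "principal", "originated_at"}


def extract(url: str) -> list[dict]:
    # Pull raw records from an upstream API (hypothetical endpoint).
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()["records"]


def validate(records: list[dict]) -> list[dict]:
    # Schema check: keep only records with all required fields present.
    # A production pipeline would quarantine and alert on the rejects.
    return [r for r in records if REQUIRED_FIELDS <= r.keys()]


def load(records: list[dict], conn: sqlite3.Connection) -> None:
    # Idempotent upsert keyed on loan_id so reruns don't duplicate rows.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS loans ("
        "loan_id TEXT PRIMARY KEY, borrower_id TEXT, "
        "principal REAL, originated_at TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO loans "
        "(loan_id, borrower_id, principal, originated_at) VALUES "
        "(:loan_id, :borrower_id, :principal, :originated_at)",
        [{k: r[k] for k in REQUIRED_FIELDS} for r in records],
    )
    conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    load(validate(extract("https://api.example.com/loans")), conn)
```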
Technical Expertise
• Strong SQL and Python development skills for data transformation and automation.
• Experience with modern ETL/ELT frameworks such as dbt.
• Proficiency with cloud platforms (AWS preferred) and serverless data services.
• Strong experience with data warehouse technologies (Snowflake preferred).
• Skilled in API integrations and ingestion from third-party systems.

Data Operations
• Proficient in data modeling (Kimball/Star schema, Data Vault).
• Experience implementing CI/CD practices for data workflows.
• Ability to set up logging, monitoring, and alerting for data jobs (a sketch follows the Qualifications list below).

Bonus Skills
• Experience building agentic workflows and orchestrating multi-step automated processes that act on data in real time.
• Familiarity with the data engineering patterns and infrastructure required for the recent wave of AI-powered tools and automation platforms.
• Experience working with financial datasets and APIs in a high-compliance environment.
• Understanding of data privacy regulations such as GDPR and CCPA.

Qualifications
• Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
• 3+ years in a data engineering or similar backend data-focused role.
• Proven track record of delivering production-grade data pipelines at scale.
• Experience collaborating closely with product managers, data scientists, and full-stack engineers.
• Startup mindset: hands-on, resourceful, and comfortable operating in a fast-paced environment.
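Similarly illustrative: one way to satisfy the logging/monitoring/alerting bullet is a small Python decorator around each pipeline step that records duration and row count and fires an alert on failure. The job name and alert hook below are hypothetical placeholders.

```python
"""Sketch of job-level logging and alerting for data pipeline steps.

Illustrative only -- the job name is arbitrary and send_alert is a
placeholder for a real incident webhook (Slack, PagerDuty, etc.).
"""
import functools
import logging
import time

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("data_jobs")


def send_alert(message: str) -> None:
    # Placeholder: in production this would POST to an incident webhook.
    log.error("ALERT: %s", message)


def monitored(job_name: str):
    """Wrap a pipeline step: log duration and row count, alert on failure."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            try:
                result = fn(*args, **kwargs)
            except Exception:
                log.exception("%s failed", job_name)
                send_alert(f"{job_name} failed")
                raise
            rows = len(result) if hasattr(result, "__len__") else "n/a"
            log.info("%s ok in %.1fs, rows=%s",
                     job_name, time.monotonic() - start, rows)
            return result
        return wrapper
    return decorator


@monitored("load_loans")
def load_loans() -> list[dict]:
    # Stand-in step; a real job would run an ETL like the earlier sketch.
    return [{"loan_id": "L-1"}, {"loan_id": "L-2"}]


if __name__ == "__main__":
    load_loans()
```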
• Competitive salary
• High ownership from day one — your work will directly shape core systems and products
• Fast-paced environment with quick decision cycles and minimal bureaucracy
• Remote-first team with flexibility on work hours and location
• Direct access to founders and cross-functional teams — no layers, no silos
• Clear expectations, regular feedback, and support for professional growth
• Work on real problems in a complex, high-impact industry
August 27
Senior Data Engineer building and operating scalable ELT pipelines for Liven, a hospitality technology provider. Driving data reliability and analytics across global venues.
Airflow
Apache
Cloud
Kafka
Python
SQL
Terraform
August 25
Big Data Engineer designing and optimizing terabyte-scale pipelines for BrightEdge's SEO platform.
Airflow
Distributed Systems
Docker
ETL
Hadoop
Kubernetes
Microservices
NoSQL
Python
Spark
SQL
August 16
Sr. Big Data Engineer for Databricks Professional Services; leads end-to-end big data and AI projects for clients. Mentors teams and designs architectures.
Apache
AWS
Azure
Bootstrap
Cloud
Google Cloud Platform
Kafka
Python
Scala
Spark
August 15
Data Engineer and Data Scientist shaping ETL pipelines and analytics at OpenFX... enabling data-driven growth for cross-border payments.
AWS
BigQuery
Cloud
ETL
Google Cloud Platform
SQL
August 11
Design and maintain Databricks pipelines at V4C.ai; implement Delta Lake and Spark. Collaborate with analysts and stakeholders.
Airflow
Apache
AWS
Azure
Cloud
ETL
Google Cloud Platform
Prometheus
Python
Scala
Spark
SQL