
Finance • Fintech • SaaS
dv01 is a data management and analytics platform that serves as a crucial link between lenders and capital markets. It specializes in providing standardized loan-level data and integrated analytics tools to facilitate easier access and analysis of structured finance products. The platform supports various asset classes, including consumer unsecured loans, mortgages, auto loans, and student loans, and offers insights through features like ESG Data Enrichment and Portfolio Surveillance. dv01's offerings help transform raw, error-laden data into trustworthy, actionable insights, aiding investment banks, hedge funds, asset managers, and institutional investors in making smarter, data-driven financial decisions. The company addresses outdated technologies in the structured products market, offering a modern, cloud-based solution to enhance data integrity and optimize financial strategies.
August 13
🇺🇸 United States – Remote
💵 $120k - $135k / year
⏰ Full Time
🟢 Junior
🟡 Mid-level
📊 Analytics Engineer
🚫👨🎓 No degree required
🦅 H1B Visa Sponsor

• Be at the heart of dv01. You’ll operate as the bridge between the engineering and analyst teams, contributing to a variety of integral processes that drive dv01 on a daily basis. Every new dataset that gets integrated within dv01 will have your fingerprints all over it.
• Be an owner of dv01's most valuable asset. You’ll own managing and scaling the business logic in our data pipeline, encapsulating all the knowledge we’ve accumulated across hundreds of datasets. The output from the pipeline powers all of dv01's customer offerings and is critical to the success of our business.
• Work directly with internal and external stakeholders. You’ll work with our team of analyst experts and connect directly with customers to understand the complexities of the data dv01 manages.
• Work with state-of-the-art technology. You’ll work with popular, modern open source technologies like DBT, Airflow, and Apache Spark. Everything we do is cloud-native on Google Cloud Platform, utilizing tools including BigQuery, Cloud Run, and Dataproc. The skills you develop here will serve you well beyond dv01.
• A well-rounded data engineer. You have 3+ years of professional experience writing production-ready code in a language such as Python, DBT, Scala, or Java, and are highly proficient in SQL. You are able to write well-thought-out code while accounting for resource and performance constraints.
• Passionate about working with data. You have 2+ years of professional experience working directly with data pipelines; exposure to loan-related datasets is an added plus. Your typical day-to-day involves wrangling messy raw data into a usable state, expressing complex business logic as code, and configuring new pipelines and debugging existing ones.
• Knowledgeable about relational data concepts. You are able to understand and explain the relationships between tables and how and why the needs of stakeholders have been captured in a particular data model.
• Excited about big data tools. You work frequently with processing frameworks and databases designed to handle large datasets and have spent time optimizing how you store and compute data within them.
• Interested and experienced in both engineering and finance. You’re looking to grow your skills in both disciplines and are excited about the synergies between finance and technology. You’re capable of understanding how investors evaluate loan portfolios and the complexities of amortization, prepayment, and default.
• A first-rate collaborator and communicator. You’re comfortable working alongside analysts and subject matter experts and translating their requirements into code. You thrive on interacting with clients to best understand and satisfy their needs.
• Unlimited PTO
• $1,000 Learning & Development Fund
• Remote-First Environment
• Health Care and Financial Planning
• Stay active your way! Get $138/month to put toward your favorite gym or fitness membership — wherever you like to work out. Prefer to exercise at home? You can also use up to $1,650 per year through our Fitness Fund to purchase workout equipment, gear, or other wellness essentials.
• New Family Bonding. Primary caregivers can take 12 weeks of 100% paid leave, while secondary caregivers can take 3 weeks. Returning to work after bringing home a new child isn’t easy, which is why we’re flexible and empathetic to the needs of new parents.
July 18
Log Analytics Engineer role at American casino and resort company, focusing on log data analysis.
AWS
Azure
ElasticSearch
Logstash
Python
Splunk
SQL
June 10
Multi Media LLC seeks an Analytics Engineer to create impactful data models and empower analytics.
Airflow
AWS
Azure
Cloud
ETL
Google Cloud Platform
Python
SQL
Tableau
April 15
11 - 50
Join Owner.com as a GTM Analytics Engineer to build scalable data solutions for restaurants.
ETL
Python
SFDC
SQL
April 4
Airflow
Amazon Redshift
Apache
AWS
BigQuery
Docker
ETL
Hadoop
Kafka
Kubernetes
Python
PyTorch
Spark
Tableau
Tensorflow
March 24
Join MojoTech as a Data & Analytics Engineer, collaborating on data solutions for clients.
Airflow
Amazon Redshift
Apache
AWS
BigQuery
Docker
ETL
Hadoop
Kafka
Kubernetes
PyTorch
Spark
Tableau
Tensorflow