Senior Data Engineer


October 22


MeridianLink

Fintech • Banking • SaaS

MeridianLink is a leading provider of SaaS solutions for financial institutions, specializing in loan origination systems and digital transformation technologies. Its end-to-end platform enhances digital experiences through integration with mortgage LOS, deposit account opening solutions, and more. MeridianLink's cloud-based systems improve efficiency in loan processing and collections, data-driven decision-making, and account management. The company collaborates with partners to expand market reach and drive growth in the fintech industry. With over 25 years of experience, MeridianLink is dedicated to supporting banks, credit unions, and other financial service providers through technology and business intelligence.

501 - 1000 employees

Founded 1998

💳 Fintech

🏦 Banking

☁️ SaaS

💰 $485M post-IPO debt raised in November 2021

📋 Description

• Design, build, implement, and maintain data processing pipelines for the extraction, transformation, and loading (ETL) of data from a variety of sources.
• Lead the writing of complex SQL queries to support analytics needs.
• Develop technical tools and programs that leverage artificial intelligence, machine learning, and big-data techniques to cleanse, organize, and transform data, and to maintain, defend, and update data structures and integrity on an automated basis.
• Evaluate and recommend tools and technologies for data infrastructure and processing.
• Collaborate with engineers, data scientists, data analysts, product teams, and other stakeholders to translate business requirements into technical specifications and coded data pipelines.
• Work with tools, languages, data processing frameworks, and databases such as R, Python, SQL, Databricks, Spark, Delta, and APIs.
• Work with structured and unstructured data from a variety of data stores, such as data lakes, relational database management systems, and data warehouses.
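The ETL responsibilities above can be sketched in miniature. This is an illustrative, standard-library-only example of the extract-transform-load pattern (the role itself uses Spark/Databricks at scale); the `loans` table, `RAW_RECORDS`, and all field names are hypothetical, not from the posting.

```python
import sqlite3

# Hypothetical source records, standing in for rows pulled from an API or data lake.
RAW_RECORDS = [
    {"loan_id": "L-001", "amount": "12500.00", "status": " approved "},
    {"loan_id": "L-002", "amount": "8000.50", "status": "PENDING"},
    {"loan_id": None, "amount": "n/a", "status": "rejected"},  # invalid row, dropped
]

def extract():
    """Extract: a real pipeline would query an API, lake, or RDBMS here."""
    return RAW_RECORDS

def transform(records):
    """Transform: cleanse types, normalize casing, drop rows that fail validation."""
    clean = []
    for r in records:
        if not r["loan_id"]:
            continue  # reject rows with no primary key
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # reject rows with non-numeric amounts
        clean.append((r["loan_id"], amount, r["status"].strip().lower()))
    return clean

def load(rows, conn):
    """Load: write validated rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS loans (loan_id TEXT PRIMARY KEY, amount REAL, status TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO loans VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
```

The same extract/transform/load split maps directly onto a Spark job: `extract` becomes a DataFrame read, `transform` a chain of column expressions and filters, and `load` a write to a Delta table.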

🎯 Requirements

• Bachelor's or master's degree in Computer Science, Engineering, or a related field.
• 6-8 years of experience in data engineering, with a strong focus on financial systems on SaaS platforms.
• Deep expertise in Python, SQL, and distributed processing frameworks and platforms such as Apache Spark, Databricks, Snowflake, Redshift, and BigQuery.
• Proven experience with cloud-based data platforms (preferably AWS or Azure).
• Hands-on experience with data orchestration tools (e.g., Airflow, dbt) and data warehouses (e.g., Databricks, Snowflake, Redshift, BigQuery).
• Strong understanding of data security, privacy, and compliance within a financial services context.
• Experience working with structured and semi-structured data (e.g., Delta, JSON, Parquet, Avro) at scale.
• Familiarity with modeling datasets in Salesforce, NetSuite, and Anaplan to solve business use cases is required.
• Previous experience democratizing data at scale for the enterprise is a huge plus.
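The semi-structured-data requirement above usually means flattening nested JSON into tabular columns before loading it into a warehouse. A minimal sketch of that step, standard library only; the sample record and the dotted-column convention are illustrative assumptions, not from the posting.

```python
import json

def flatten(record, parent_key="", sep="."):
    """Recursively flatten a nested dict into dotted column names,
    a common step before loading semi-structured JSON into a warehouse table."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))  # recurse into nested objects
        else:
            items[new_key] = value  # leaf value becomes one column
    return items

# Hypothetical nested record, e.g. one document from an API payload.
raw = json.loads('{"loan_id": "L-001", "borrower": {"name": "Ada", "address": {"state": "CA"}}}')
flat = flatten(raw)
```

Here `flat` maps `"borrower.address.state"` to `"CA"`, so each leaf of the JSON becomes a warehouse-friendly column; Parquet or Delta writers can then infer a flat schema from such rows.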

🏖️ Benefits

• Insurance coverage (medical, dental, vision, life, and disability)
• Flexible paid time off
• Paid holidays
• 401(k) plan with company match
• Remote work

