Senior Data Engineer - Growth

June 25

Flex

Fintech • Real Estate • B2C

Flex is a financial technology company that enables renters to split and schedule rent payments, pay move-in costs over time, and build credit through on-time rent reporting. It offers consumer-facing payment products (rent split, pay-over-time, and move-in loans) and partners with property owners and property-management systems to help properties collect on-time rent and retain residents. Flex operates via bank partnerships and regulated lending/servicing subsidiaries to provide loans and payment processing while emphasizing security and regulatory compliance.

201 - 500 employees

Founded 2019

📋 Description

• Design, implement, and maintain high-quality data pipeline services, including but not limited to Data Lake, Kafka, Amazon Kinesis, and data access layers.
• Develop robust and efficient DBT models and jobs to support analytics reporting and machine learning modeling.
• Collaborate closely with the Analytics team on data modeling, reporting, and data ingestion.
• Collaborate closely with the Data Science team to build datasets for ML models.
• Create scalable real-time streaming pipelines and offline ETL pipelines.
• Design, implement, and manage a data warehouse that provides secure access to large datasets.
• Continuously improve data operations by automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability.
• Create engineering documentation for designs, runbooks, and best practices.
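Streaming pipelines like those described above typically consume events from Kafka or Kinesis with at-least-once delivery, so deduplication is a common building block. The sketch below is purely illustrative — the event schema, field names, and `dedupe_latest` helper are hypothetical examples of this kind of work, not Flex's actual code:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class PaymentEvent:
    """Hypothetical rent-payment event as it might arrive from a stream."""
    event_id: str
    tenant_id: str
    amount_cents: int
    ts: int  # event timestamp, epoch seconds


def dedupe_latest(events):
    """Keep only the most recent record per event_id.

    With at-least-once delivery, the same event_id may appear multiple
    times; later timestamps win. Returns events ordered by timestamp.
    """
    latest = {}
    for e in events:
        current = latest.get(e.event_id)
        if current is None or e.ts > current.ts:
            latest[e.event_id] = e
    return sorted(latest.values(), key=lambda e: e.ts)
```

In production this logic would usually live behind a stateful stream processor or be pushed into the warehouse layer (e.g. an incremental DBT model with a `qualify row_number() ... = 1` dedupe), but the core idea is the same.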

🎯 Requirements

• A minimum of 6 years of industry experience in the data infrastructure/data engineering domain.
• A minimum of 6 years of experience with Python and SQL. Java experience is a plus.
• A minimum of 3 years of industry experience using DBT.
• A minimum of 3 years of industry experience using Snowflake and its basic features.
• Familiarity with AWS services, with industry experience using Lambda, Step Functions, Glue, RDS, EKS, DMS, EMR, etc.
• Industry experience with different big data platforms and tools such as Snowflake, Kafka, Hadoop, Hive, Spark, Cassandra, Airflow, etc.
• Industry experience working with relational and NoSQL databases in a production environment.
• Strong fundamentals in data structures, algorithms, and design patterns.

🏖️ Benefits

• Competitive pay
• 100% company-paid medical, dental, and vision
• 401(k) + company equity
• Unlimited paid time off + 13 company-paid holidays
• Parental leave
• Flex Cares Program: non-profit company match + pet adoption coverage
• Free Flex subscription

Similar Jobs

• June 23: Senior Azure Data Engineer role enhancing structured and unstructured data for client insights. (Azure, JavaScript, SQL, SSIS)
• June 18: Data Engineer role focusing on Python programming, ETL development, and data modeling. Engaging in data migration across storage systems while working in a remote environment. (ETL, Pandas, PySpark, Python)
• June 18: Data Architect creating the application's data architecture. Requires expertise in system integration, data modeling, and third-party integrations. (ETL)
• June 14: Join a tech company as a Data Engineer focusing on AWS analytics solutions. (Airflow, Amazon Redshift, Apache, AWS, Cloud, DynamoDB, ETL, Python, SQL, Terraform)
• June 13: As a Data Engineer at Digital Media Solutions, design and maintain data infrastructure and pipelines to ensure timely data accessibility. (Amazon Redshift, AWS, Cloud, Docker, DynamoDB, ETL, JavaScript, Kubernetes, MySQL, NoSQL, Postgres, Python, SQL, Terraform)
