Mid-Level Data Engineer

November 20


Lean Tech

B2B • Recruitment • SaaS

Lean Tech is a staff augmentation company that specializes in offering businesses the talent and support they need through a proven nearshore and offshore model. The company helps businesses establish remote satellite offices in Latin America and the Philippines, building mission-critical teams across functions such as operations, technology, marketing, and sales. Lean Tech focuses on workforce optimization, providing tailored solutions that align people, processes, and technology to enhance customer experiences while minimizing costs. The company also emphasizes social impact through corporate social responsibility initiatives.

📋 Description

• Design, build, and optimize ELT pipelines for data ingestion and transformation.
• Develop and maintain Snowflake data warehouse models to enable reliable reporting.
• Implement and manage transformations in DBT.
• Create and support Python-based data integrations within GCP for shuttling raw data into Snowflake.
• Collaborate closely with data leaders and stakeholders to ensure accuracy, performance, and scalability.
• Ensure data quality through validation, monitoring, and troubleshooting of pipelines.
• Document data models, transformations, and system integrations.
• Support BI and reporting needs by enabling downstream data availability for Power BI dashboards.
• Contribute to improving processes, proactively addressing gaps, and ensuring alignment with business expectations.
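To illustrate the kind of work described above, here is a minimal sketch of one ELT load step: raw files staged in GCP cloud storage are copied into a Snowflake landing table, followed by a basic row-count validation. All names (stage, table, file path) are hypothetical examples, not part of the actual role or codebase.

```python
# Hypothetical ELT load step: build the Snowflake COPY INTO statement
# for a file staged in GCS, then validate that every row landed.
# Stage, table, and path names below are illustrative only.

def build_copy_statement(table: str, stage: str, path: str) -> str:
    """Build a Snowflake COPY INTO statement for a staged Parquet file."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage}/{path} "
        "FILE_FORMAT = (TYPE = 'PARQUET') "
        "ON_ERROR = 'ABORT_STATEMENT';"
    )

def validate_row_counts(source_rows: int, loaded_rows: int) -> bool:
    """Basic post-load data-quality check: every staged row must have landed."""
    return source_rows == loaded_rows

sql = build_copy_statement("raw.events", "gcs_stage", "events/2024-01-01.parquet")
print(sql)
```

In an ELT pattern like the one this role describes, raw data lands in the warehouse first and transformations happen afterwards (e.g. in DBT), rather than being transformed in flight.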

🎯 Requirements

• 3+ years in data engineering or backend systems with a strong data focus.
• Hands-on experience with:
  • Snowflake (data modeling and warehouse optimization).
  • DBT (data transformation layer).
  • Python (for building data pipelines and integrations).
  • GCP (data ingestion/storage workflows).
• Strong SQL skills for querying, modeling, and performance tuning.
• Experience working with ELT data patterns.
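The "ELT data patterns" requirement above can be sketched with a common transform step: after raw rows are loaded, keep only the latest record per business key. This is the same logic a DBT incremental model typically expresses in Snowflake SQL with `ROW_NUMBER() ... QUALIFY`; the pure-Python version below is a hedged illustration with made-up field names.

```python
# Illustrative "T" step of ELT: deduplicate raw rows, keeping the most
# recent record per key. Field names ("id", "updated_at") are examples.

def latest_per_key(rows, key="id", ts="updated_at"):
    """Return one row per key, choosing the row with the latest timestamp."""
    latest = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[ts] > latest[k][ts]:
            latest[k] = row
    return sorted(latest.values(), key=lambda r: r[key])

rows = [
    {"id": 1, "updated_at": "2024-01-01", "status": "new"},
    {"id": 1, "updated_at": "2024-01-02", "status": "paid"},
    {"id": 2, "updated_at": "2024-01-01", "status": "new"},
]
deduped = latest_per_key(rows)  # id 1 keeps its latest "paid" record
```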

🏖️ Benefits

• Professional development opportunities with international customers
• Collaborative work environment
• Career path and mentorship programs


Similar Jobs

November 18

Data Engineer designing, building and optimizing data flows on Azure Cloud for international high-impact projects. Collaborating with business and data science teams to deliver scalable solutions.

🗣️🇪🇸 Spanish Required

Azure

Cloud

ETL

NoSQL

PySpark

Python

SQL

Unity

Vault

November 14

Data Engineer designing scalable infrastructure and data pipelines at Sonatype. Collaborating with product and engineering teams to ensure data reliability and accessibility.

🇨🇴 Colombia – Remote

💰 $80M Private Equity Round in September 2018

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

AWS

Cloud

EC2

ETL

Java

Python

SQL

November 14

Senior Data Engineer at Sezzle developing large-scale data pipelines for insights and operational workflows. Collaborating across teams to enhance data reliability, scalability, and efficiency in a fintech environment.

🇨🇴 Colombia – Remote

💵 $5k - $9k / month

⏰ Full Time

🟠 Senior

🚰 Data Engineer

Airflow

Amazon Redshift

AWS

Distributed Systems

ETL

Java

Python

Scala

SQL

November 10

Data Engineer collaborating with various areas at Tranqui to streamline data analytics. Focused on automation, infrastructure management, and data quality while using DataOps principles.

🗣️🇪🇸 Spanish Required

Amazon Redshift

BigQuery

ETL

Node.js

Pandas

Postgres

Python

TypeScript

November 9

Senior Data Engineer responsible for building and maintaining scalable data pipelines. Collaborating with teams to optimize data infrastructure and enable efficient data delivery in Colombia.

🇨🇴 Colombia – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

AWS

Azure

Cloud

Google Cloud Platform

Python

SQL

Terraform
