
B2B • SaaS • Human Resources
EX Squared LATAM is a staffing solutions provider that specializes in connecting companies with highly skilled tech professionals from around the globe. Since 2001, they have developed a reputation for delivering vetted talent on time and within budget, operating primarily in the field of nearshore staffing. With over 550 certified experts, EX Squared LATAM focuses on meeting the high standards of their clients and is committed to providing excellent service and expertise in technology, engineering, and agile project management.
November 18
🗣️🇪🇸 Spanish Required

• Design and maintain data pipelines in Azure Data Factory, integrating multiple sources (relational databases, NoSQL, APIs, files)
• Build efficient data flows for ingestion, transformation, and loading (ETL/ELT), applying automation best practices
• Develop notebooks and processes in Azure Databricks using PySpark, oriented toward Medallion architectures (Bronze/Silver/Gold); a minimal sketch follows the benefits list below
• Configure and manage secure connections to Azure Data Lake (ABFSS, Unity Catalog, Key Vault)
• Participate in data modeling for analytical consumption (star schema, snowflake, Data Vault)
• Collaborate with technical and business teams to translate requirements into scalable solutions
• Monitor pipeline performance and optimize data quality, efficiency, and security
• 5+ years of experience with Azure Data Factory or similar data integration tools
• Knowledge of Azure Databricks and PySpark, with a willingness to keep deepening expertise in this environment
• Experience with data modeling and with Data Lakes or Lakehouse architectures
• Strong command of SQL and familiarity with Python for automation or data transformation
• Understanding of governance, security, and deployment concepts (DevOps / CI/CD) in cloud environments
• Ability to solve problems in a structured way, document your work, and learn autonomously
• Access to learning platforms
• Mentoring and technical certifications
• Flexible schedules
• Inclusive culture
• Collaborative environment
• International projects with cutting-edge technologies
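The Medallion (Bronze/Silver/Gold) responsibility above can be pictured with a minimal PySpark sketch of a Bronze-to-Silver step. It assumes a Databricks runtime with Delta Lake available and ABFSS access to Azure Data Lake already configured (for example through Unity Catalog or Key Vault-backed credentials); the storage account, container, table, and column names are illustrative placeholders, not details from the posting.

```python
# Minimal Bronze -> Silver sketch for a Medallion layout on Databricks.
# Paths and column names are hypothetical; credentials/ABFSS access are
# assumed to be configured elsewhere (e.g. Unity Catalog or Key Vault).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

bronze_path = "abfss://bronze@<storage-account>.dfs.core.windows.net/orders/"
silver_path = "abfss://silver@<storage-account>.dfs.core.windows.net/orders/"

# Bronze: raw ingested records, stored as-is.
orders_bronze = spark.read.format("delta").load(bronze_path)

# Silver: deduplicated, typed, and lightly cleaned for downstream modeling.
orders_silver = (
    orders_bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())
)

orders_silver.write.format("delta").mode("overwrite").save(silver_path)
```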
November 14
Data Engineer designing scalable infrastructure and data pipelines at Sonatype. Collaborating with product and engineering teams to ensure data reliability and accessibility.
🇨🇴 Colombia – Remote
💰 $80M Private Equity Round (September 2018)
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
AWS
Cloud
EC2
ETL
Java
Python
SQL
November 14
Senior Data Engineer at Sezzle developing large-scale data pipelines for insights and operational workflows. Collaborating across teams to enhance data reliability, scalability, and efficiency in a fintech environment.
Airflow
Amazon Redshift
AWS
Distributed Systems
ETL
Java
Python
Scala
SQL
November 10
201 - 500 employees
Data Engineer collaborating with teams across Tranqui to streamline data analytics. Focused on automation, infrastructure management, and data quality, applying DataOps principles.
🗣️🇪🇸 Spanish Required
Amazon Redshift
BigQuery
ETL
Node.js
Pandas
Postgres
Python
TypeScript
November 9
Senior Data Engineer responsible for building and maintaining scalable data pipelines. Collaborating with teams to optimize data infrastructure and enable efficient data delivery in Colombia.
AWS
Azure
Cloud
Google Cloud Platform
Python
SQL
Terraform
November 8
Senior Data Engineer building and maintaining data pipelines for Xeus Solutions. Designing workflows, ensuring data quality, and collaborating with cross-functional teams.
Airflow
Amazon Redshift
AWS
Cloud
DynamoDB
MySQL
Postgres
Python
SQL