Data Engineer

Job not on LinkedIn

5 hours ago

🗣️🇪🇸 Spanish Required


Keyrus

B2B • Data Analytics • Consulting

Keyrus is an international company passionate about leveraging data to make impactful changes in life, society, and the future. With a presence in 18 countries, Keyrus is dedicated to creating meaningful careers for its employees by fostering excellence, trust, creativity, kindness, and fun. Specializing in data analytics, data advisory, and management, Keyrus provides vendor-agnostic solutions and continuous training opportunities through its KLX platform. The company also emphasizes a healthy work-life balance with a range of benefits, including sports events, healthcare plans, and an inclusive work culture. Keyrus is committed to openness and transparency to maintain positive workplace relationships and is driven to innovate and influence the digital future.

1001 - 5000 employees

Founded 1996

🤝 B2B

📋 Description

• Design, develop, and maintain ETL/ELT pipelines on AWS.
• Implement transformations and processing logic in AWS Glue (PySpark).
• Build integrations with REST APIs (batch submission, status queries, building/parsing JSON); see the sketch after this list.
• Create and maintain control and audit tables in Redshift.
• Orchestrate flows and processes with AWS Step Functions and EventBridge.
• Ensure operational stability: retries, idempotency, traceability, logging, and error handling.
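For illustration only (not part of the posting), the following is a minimal Python sketch of the kind of batch REST submission described above, with an idempotency key, retries with backoff, and logging. The endpoint URL, credentials, payload fields, and retry policy are placeholder assumptions.

```python
# Sketch: batch POST to a REST API with retries, idempotency key, and logging.
# The endpoint, credentials, and payload shape are placeholders, not from the posting.
import logging
import time
import uuid

import requests
from requests.auth import HTTPBasicAuth

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("batch_sender")

API_URL = "https://api.example.com/v1/batches"  # placeholder endpoint
AUTH = HTTPBasicAuth("user", "secret")          # placeholder credentials


def send_batch(records, max_retries=3):
    """POST one batch of records, retrying transient failures with backoff."""
    payload = {"batch_id": str(uuid.uuid4()), "records": records}
    # Re-sending the same key lets the server deduplicate retried batches (idempotency).
    headers = {"Idempotency-Key": payload["batch_id"]}

    for attempt in range(1, max_retries + 1):
        try:
            resp = requests.post(API_URL, json=payload, auth=AUTH,
                                 headers=headers, timeout=30)
            if resp.ok:
                log.info("batch %s accepted (HTTP %s)", payload["batch_id"], resp.status_code)
                return resp.json()
            if resp.status_code < 500:
                # 4xx means the request itself is wrong; retrying will not help.
                raise RuntimeError(f"batch rejected: HTTP {resp.status_code} {resp.text}")
            log.warning("server error HTTP %s on attempt %d", resp.status_code, attempt)
        except requests.RequestException as exc:
            log.warning("attempt %d failed: %s", attempt, exc)
        time.sleep(2 ** attempt)  # exponential backoff before the next try

    raise RuntimeError(f"batch {payload['batch_id']} failed after {max_retries} attempts")


if __name__ == "__main__":
    print(send_batch([{"id": 1, "value": "example"}]))
```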

🎯 Requirements

• Degree in Systems Engineering, Industrial Engineering, Business Administration, or a related field.
• More than 3 years of experience with data engineering and analytics tools such as Alteryx, Dataiku, Azure Data Factory, and others.
• Strong experience with AWS Lambda, AWS Glue (PySpark), Step Functions, and EventBridge.
• Proficiency in Python (building JSON, consuming REST APIs, error handling).
• Advanced knowledge of Amazon Redshift: SQL, stored procedures, modeling, optimization.
• Experience integrating APIs (POST/GET, Basic Auth, exception handling).
• Implementation of batch and incremental processes (see the sketch after this list).
• Good practices for logging, traceability, and state control.
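Also for illustration only, here is a minimal sketch of an AWS Glue (PySpark) job that loads data incrementally using a watermark passed as a job argument. The S3 paths, column names, and the last_watermark parameter are assumptions, not details from the posting.

```python
# Sketch: AWS Glue (PySpark) job with an incremental (watermark-based) load.
# Paths, column names, and the last_watermark argument are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

# Job arguments: --JOB_NAME and --last_watermark would be supplied by the orchestrator
# (e.g. a Step Functions state) in this hypothetical setup.
args = getResolvedOptions(sys.argv, ["JOB_NAME", "last_watermark"])

sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read only rows newer than the previous run's watermark (incremental load).
source = (
    spark.read.format("parquet")
    .load("s3://example-bucket/raw/orders/")          # placeholder path
    .filter(F.col("updated_at") > args["last_watermark"])
)

# Example transformation: stamp each row with the load timestamp for auditability.
transformed = source.withColumn("load_ts", F.current_timestamp())

# Write the increment to a staging prefix; in a real pipeline a COPY into Redshift
# (and an update to a control/audit table) would typically follow.
(transformed.write.mode("append")
 .parquet("s3://example-bucket/staging/orders/"))     # placeholder path

job.commit()
```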

🏖️ Benefits

• Health, leisure, and well-being
• Professional development program


