Senior Data Engineer

Job not on LinkedIn

October 17

Enroute

SaaS • Enterprise • Telecommunications

Enroute delivers IT services and solutions through a team of passionate, problem-solving specialists skilled in a range of IT and business practices. The company specializes in data engineering, moving and using data effectively to increase business value regardless of data sources or types. Enroute also offers software development services, providing analysis and implementing software strategies tailored to long-term organizational needs. Quality engineering is another key offering, with a focus on building quality into processes and software testing from the early stages of development. With a strong emphasis on IT strategy and on software and data solutions, Enroute aims to impact organizations and improve user experiences through tailored services and expert solutions.

📋 Description

• Design, develop, and maintain ETL/ELT pipelines that connect multiple data sources.
• Work with SQL, Python, and major cloud data warehouses (Snowflake, Redshift, BigQuery, or similar).
• Develop and optimize data models to enable analytics and machine learning initiatives.
• Ensure data governance, quality, and security across all pipelines.
• Collaborate closely with cross-functional teams to translate business needs into data solutions.

🎯 Requirements

• 5+ years of experience as a Data Engineer.
• Strong skills in SQL and experience with both relational and non-relational databases.
• Hands-on experience with data pipeline tools and cloud platforms.
• Python experience for data manipulation (pandas, PySpark, etc.).
• Familiarity with data visualization or reporting tools (Power BI, Looker, Tableau, etc.).
• Strong communication skills and a proactive, problem-solving mindset.

**Nice to have**

• Experience with real-time streaming (Kafka, Kinesis, Spark Streaming).
• Knowledge of Infrastructure as Code (Terraform, CloudFormation).
• Exposure to machine learning or data science workflows.
• Experience with DevOps practices for data (CI/CD, monitoring).

🏖️ Benefits

• Monetary compensation
• Year-end Bonus
• IMSS, AFORE, INFONAVIT
• Major Medical Expenses Insurance
• Minor Medical Expenses Insurance
• Life Insurance
• Funeral Expenses Insurance
• Preferential rates for car insurance
• TDU Membership
• Holidays and Vacations
• Sick days
• Bereavement days
• Civil Marriage days
• Maternity & Paternity leave
• English and Spanish classes
• Performance Management Framework
• Certifications
• TALISIS Agreement: Discounts at ADVENIO, Harmon Hall, U-ERRE, UNID
• Taquitos Rewards
• Amazon Gift Card on your Birthday
• Work-from-home Bonus
• Laptop Policy

Similar Jobs

October 10

Senior Data Engineer creating distributed data platforms and scalable pipelines for international projects at Dresden Partners. Requires advanced English and expertise in Spark, Scala, and SQL.

🗣️🇪🇸 Spanish Required

Amazon Redshift • BigQuery • ETL • Java • Kafka • Python • Scala • Spark • SQL

September 19

Big Data Engineer building ELT pipelines with Airflow and Snowflake at a global BPO/IT services firm. Focus on AWS, dbt, Python, and data governance.

Airflow • Apache • AWS • Cloud • Python • SQL

September 10

Design, develop, and optimize Microsoft Fabric ETL pipelines and data models. Collaborate with analysts and data scientists at a global IT/BPO services provider.

🗣️🇪🇸 Spanish Required

Azure • Cloud • ETL • Python • SQL

September 7

Build and optimize Databricks/Apache Spark data pipelines for analytics. Collaborate with data scientists and stakeholders at a global IT consulting and BPO firm.

🗣️🇪🇸 Spanish Required

Apache • AWS • Azure • Cloud • Java • Python • Scala • Spark • SQL

September 7

Azure Data Engineer building and optimizing Azure data pipelines and Synapse solutions for a global BPO and consulting firm.

🗣️🇪🇸 Spanish Required

Azure • Cloud • ETL • Python • SQL

Built by Lior Neu-ner. I'd love to hear your feedback. Get in touch via DM or support@remoterocketship.com