Analytics Engineer, AWS

October 27

🗣️🇧🇷🇵🇹 Portuguese Required

Leega

API • Artificial Intelligence • Cloud Solutions

Leega is a leading technology solutions provider in Latin America, specializing in data analytics and cloud solutions. As the first company in the region certified by Google Cloud for Data Analytics, Leega offers a range of services including application development, machine learning, and risk management analytics. The firm partners with major cloud providers such as AWS and Microsoft Azure to help businesses improve their data management and transition effectively to the cloud, ultimately driving digital transformation and innovation.

201 - 500 employees

Founded 2010

📋 Description

• Develop data pipelines using AWS Glue, S3 and Lambda, structuring data from various operational systems and railway sensors (a sketch of this trigger pattern follows this list).
• Create and maintain modular Kedro pipelines, ensuring traceability, versioning and reproducibility of analytical workflows.
• Prepare, standardize and transmit operational data to regulatory bodies via API, ensuring compliance and data integrity.
• Integrate and transform data for use in machine learning models on AWS SageMaker, connecting analytical results to business teams.
• Implement DataOps best practices, CI/CD and version control with Git for analytical pipelines.
• Design and optimize dimensional models focused on operational performance, transit time, productivity and route capacity.
• Support Data Science and Operational Planning teams with consistent, highly available data for predictive and prescriptive analyses.
• Ensure governance and quality of critical railway operations data.
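As a rough illustration of the Glue/S3/Lambda pattern named in the first responsibility, a minimal sketch might look like the following; the job name, bucket layout and argument keys are hypothetical assumptions, not details from this posting:

```python
import boto3

glue = boto3.client("glue")

def handler(event, context):
    """Thin S3-triggered Lambda that hands a newly landed object off to a Glue job."""
    record = event["Records"][0]          # S3 ObjectCreated event record
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    # "raw-to-curated" and "--source_path" are placeholder names for this sketch.
    response = glue.start_job_run(
        JobName="raw-to-curated",
        Arguments={"--source_path": f"s3://{bucket}/{key}"},
    )
    return {"job_run_id": response["JobRunId"]}
```

In this pattern, Lambda stays a thin trigger and the heavy transformation lives in the Glue job itself, which keeps invocations cheap and individual runs easy to replay.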

🎯 Requirements

• Strong SQL skills and experience with Python (Pandas, PySpark, boto3).
• Proven experience with AWS Glue, SageMaker, S3, Lambda and API Gateway.
• Experience with Kedro for developing analytical pipelines (see the sketch after this list).
• Knowledge of data modeling (Kimball, Data Vault, Lakehouse).
• Experience integrating with and consuming REST APIs.
• Experience in DataOps, Git versioning and pipeline orchestration (Airflow, Step Functions, Dagster or similar).
• Preferred: Power BI, Looker or other visualization tools.
• Advantageous: experience with logistics, transport, port or railway operations data.
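For context on the Kedro requirement above, a minimal modular pipeline sketch is shown below; the node functions and dataset names are illustrative assumptions, not part of the role:

```python
from kedro.pipeline import node, pipeline

def standardize_sensor_data(raw_df):
    """Normalize column names from raw railway sensor feeds (placeholder logic)."""
    return raw_df.rename(columns=str.lower)

def build_transit_time_features(clean_df):
    """Derive transit-time features for downstream SageMaker models (placeholder)."""
    return clean_df

def create_pipeline(**kwargs):
    # Dataset names ("raw_sensor_data", etc.) would be registered in the Kedro
    # data catalog; they are hypothetical here.
    return pipeline(
        [
            node(standardize_sensor_data, inputs="raw_sensor_data", outputs="clean_sensor_data"),
            node(build_transit_time_features, inputs="clean_sensor_data", outputs="transit_time_features"),
        ]
    )
```

Because each node declares its inputs and outputs explicitly, Kedro can version datasets and re-run the pipeline end to end, which is what the traceability and reproducibility responsibilities refer to.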

🏖️ Benefits

• Ongoing training and development

Similar Jobs

October 22

Analytics Engineer responsible for developing data pipelines and collaborating with business and data science teams to support exploratory analyses and predictive models.

🗣️🇧🇷🇵🇹 Portuguese Required

Airflow • Amazon Redshift • AWS • Cloud • ETL • Kafka • Kubernetes • Oracle • Postgres • Python • SQL

October 17

Mid-Senior Data Specialist at CI&T developing data ingestion pipelines and ensuring data quality, collaborating with cross-functional teams on projects for international clients.

🗣️🇧🇷🇵🇹 Portuguese Required

SQL

October 10

Senior Data Insights Engineer at Trustly analyzing data architecture and visualization for innovative payment solutions, collaborating with global teams to drive a data-driven culture.

🗣️🇧🇷🇵🇹 Portuguese Required

PySpark • Python • SQL

May 23

Analytics Engineer at Arco Educação collaborating with teams to enhance data utilities.

🗣️🇧🇷🇵🇹 Portuguese Required

Airflow • AWS • BigQuery • Cloud • Google Cloud Platform • Java • Python • Scala • Spark • SQL • Vault • Go
