Data Architecture Engineer

Job not on LinkedIn

6 hours ago


Hitachi

Artificial Intelligence • Energy • Transport

Hitachi, Ltd. is a global company that operates in various sectors with a focus on social innovation. Hitachi provides advanced digital solutions, services, and technologies under its brand 'Lumada' to solve customers’ challenges and enhance societal value. The company is committed to sustainability and environmental, social, and governance (ESG) initiatives, aiming to improve quality of life through technologies that foster a sustainable society. Hitachi's products and solutions span digital systems, green energy, mobility, and connective industries, promoting digital transformation of business systems and social infrastructure. Founded in 1910, Hitachi's long-standing dedication to innovation and societal development makes it a leader in driving positive change worldwide.

25,801 employees

Founded 1910


📋 Description

• Design, develop, and maintain scalable ETL/ELT data pipelines on AWS (see the sketch after this list).
• Ingest data from a wide range of sources into centralized Data Lakes and Data Warehouses.
• Collaborate with DevOps and ingestion teams to ensure smooth and reliable data delivery.
• Monitor production workflows and pipelines, ensuring timely issue detection and resolution.
• Ensure adherence to coding best practices and maintain comprehensive documentation.
• Support and enhance existing ETL workflows to improve performance and reliability.
• Work closely with reporting teams to enable data accessibility via tools like Power BI.
• Uphold high standards of data quality, governance, and reliability across the data infrastructure.
• Provide strategic nearshore support in the Mexico region to enable 24/7 operational coverage.
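The responsibilities above revolve around AWS-based ETL/ELT into a data lake. Purely as an illustrative sketch (not Hitachi's actual pipeline), the snippet below shows a minimal PySpark job of that shape: it reads raw CSV from S3, applies basic cleansing, and writes partitioned Parquet to a curated zone. The bucket paths and column names are hypothetical, and the same code runs on AWS Glue or EMR.

```python
# Minimal sketch of the kind of ETL pipeline described above, assuming a
# hypothetical raw bucket and data-lake bucket; column names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV files landed by an upstream ingestion job (path is hypothetical)
raw = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

# Transform: basic deduplication, typing, and filtering before the data reaches analysts
orders = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet into the data lake, where it can be queried
# by Redshift Spectrum or surfaced to Power BI
(orders.write
       .mode("overwrite")
       .partitionBy("order_date")
       .parquet("s3://example-data-lake/curated/orders/"))
```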

🎯 Requirements

• 3-5 years of hands-on experience in data engineering and ETL development.
• Good knowledge of Python for data transformation and scripting tasks.
• Very strong SQL skills are a must – ability to write complex queries and perform data analysis (see the query sketch after this list).
• Proven experience working on the AWS cloud platform (e.g., S3, Glue, Lambda, EMR, Redshift).
• Solid understanding of data lake and data warehouse architectures.
• Experience with other cloud platforms (Azure or GCP) is a plus.
• Familiarity with data modeling concepts and schema design (Star/Snowflake schema).
• Experience integrating with reporting tools like Power BI.
• Strong troubleshooting skills and the ability to proactively monitor and maintain production systems.
• Demonstrated ability to work collaboratively in cross-functional teams across geographies.
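As a concrete, purely hypothetical illustration of the SQL and Star/Snowflake-schema skills listed above, the sketch below joins an assumed fact table to a dimension table and aggregates revenue by region and month. It is run through Spark SQL so it stays in the Python toolchain named in the requirements; all table, column, and path names are assumptions.

```python
# Illustrative star-schema query of the sort mentioned in the requirements;
# table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-demo").getOrCreate()

# Assume a fact table and a dimension table already curated in the data lake
# (e.g. by a pipeline like the one sketched earlier); paths are placeholders.
spark.read.parquet("s3://example-data-lake/curated/orders/").createOrReplaceTempView("fact_orders")
spark.read.parquet("s3://example-data-lake/curated/customers/").createOrReplaceTempView("dim_customer")

# Monthly revenue and order counts by customer region
monthly_revenue = spark.sql("""
    SELECT d.region,
           date_trunc('month', f.order_ts) AS month,
           SUM(f.amount)                   AS revenue,
           COUNT(DISTINCT f.order_id)      AS orders
    FROM fact_orders f
    JOIN dim_customer d ON f.customer_id = d.customer_id
    GROUP BY d.region, date_trunc('month', f.order_ts)
    ORDER BY month, region
""")
monthly_revenue.show()
```

Because it is plain ANSI-style SQL, largely the same query could run on Redshift or another warehouse engine once the tables live there.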

🏖️ Benefits

• Industry-leading benefits that go far beyond compensation
• Support, services, and resources that also take care of your holistic health and wellbeing
• Flexible arrangements that work for you (role and location dependent)

Apply Now

Similar Jobs

4 days ago

Data Engineer at EX Squared LATAM creating scalable e-commerce data models for analytics. Contributing to data platform migration and enhancing modern analytics ecosystems.

Amazon Redshift • AWS • Azure • BigQuery • Cloud • SQL

4 days ago

Data Migration Engineer responsible for developing data migration strategies and processes at 8am fintech. Collaborating with clients to ensure accurate and efficient data migration.

ETL • MySQL • Postgres • Python • Ruby • Ruby on Rails • SQL

5 days ago

Senior Data Engineer at Derevo creating and implementing modern data architectures with Big Data technologies. Responsible for high-quality analytical solutions and data management practices.

🗣️🇪🇸 Spanish Required

Apache • Azure • ETL • PySpark • Python • Spark • SQL • Unity

5 days ago

Senior Data Engineer leading the design and implementation of data pipelines at a technology-led marketing firm. Collaborating with senior leaders to deliver robust data engineering solutions.

BigQuery • Cloud • Distributed Systems • ETL • Google Cloud Platform

6 days ago

Senior Data Engineer at Qualifinds architecting AI-driven valuation tools and market analytics. Leading development of data pipelines and collaborating across engineering and product teams.

Airflow • ETL • Postgres • Python • SQL • Terraform