AWS Data Engineer

October 31

Apply Now

Capgemini

Enterprise • Artificial Intelligence • Cybersecurity

Capgemini is a global leader in partnering with businesses to transform and manage their operations by harnessing the power of technology. With expertise across a wide array of industries such as aerospace, automotive, banking, and healthcare, Capgemini provides a constantly evolving portfolio of services to meet the ever-changing needs of its clients. Its offerings include cloud, cybersecurity, data and artificial intelligence, and enterprise management, among others. Capgemini also emphasizes innovation and sustainability, helping companies achieve digital transformation while promoting environmental and social responsibility. Additionally, Capgemini provides career opportunities across various levels and professions, encouraging innovation and diversity in its workforce.

10,000+ employees

Founded 1967

🏢 Enterprise

🤖 Artificial Intelligence

🔒 Cybersecurity

📋 Description

• Design, build, and maintain reusable, modular, and configuration-driven frameworks for ingesting both historical and incremental data from diverse sources into Iceberg tables on AWS S3.

• Expose ingested data to Snowflake via Snowflake external tables, ensuring seamless integration and accessibility.

• Implement robust logging mechanisms to monitor all data processes for completeness, timeliness, accuracy, and validity (ABC metrics).

• Configure automated notifications to alert support teams of process statuses and anomalies.

• Adhere to architectural standards and development best practices throughout the lifecycle.

• Translate complex business requirements into scalable and efficient technical solutions.

• Independently plan and execute the implementation of new data capabilities, including: development of project plans with clear milestones and delivery timelines; task breakdown, assignment, and management; comprehensive documentation and tracking of work in Rally or equivalent tools; identification and management of dependencies across cross-functional teams.

• Coordinate effectively with internal and external stakeholders, including Cloud Operations, Information Security, business units, and other development teams; facilitate alignment and secure commitment from partner teams to meet project deliverables and dependency timelines.

• Communicate complex technical concepts to technical and non-technical personnel.

• Deliver routine progress and status updates to stakeholders.

• Tailor communication to the audience's experience, background, and expectations, using terms, examples, and analogies that are meaningful to them.

• Ensure the accuracy of communicated information to effectively support project leadership decision-making.

• Proactively build and maintain knowledge of current and emerging technologies, concepts, and trends in the IT field.

• Provide input on improving or enhancing existing organizational processes based on lessons learned and experience from project work.

• Perform root cause analysis to quickly identify and resolve recurring technical problems.

• Demonstrate a high degree of independence and ownership in driving initiatives from concept to completion.

• Proactively identify challenges and inefficiencies, and act swiftly to resolve them without waiting for direction.

• Navigate complex organizational structures to engage the right stakeholders and ensure timely delivery.

• Maintain a solution-oriented mindset, continuously seeking opportunities to improve processes, enhance collaboration, and deliver value.

🎯 Requirements

• Minimum of 2–4 years of hands-on experience in data engineering within the AWS ecosystem.

• At least 4 years of total IT experience, including demonstrated success as a software developer.

• Full English fluency.

• Data Processing & Orchestration: Spark, AWS Glue, AWS Step Functions, and EMR (MUST).

• Storage & Lakehouse Architecture: S3, Iceberg, and Snowflake external tables.

• Security & Access Management: IAM and Lake Formation.

• Monitoring & Logging: CloudWatch for operational visibility and alerting.

• Development & Automation: strong Python programming skills and experience with Jenkins and CI/CD pipelines for automated deployment and testing.

• Architecture & Design: understanding of data lake and lakehouse architectures, modular and configuration-driven development, and scalable ingestion frameworks.

• Cloud certifications.

• Good exposure to Agile software development and DevOps practices such as Infrastructure as Code (IaC), continuous integration, and automated deployment.

• Strong practical application development experience on Linux- and Windows-based systems.

• Proven ability to work independently and collaboratively across cross-functional teams, with excellent verbal and written communication skills.

• Agile Methodologies: familiarity with Agile development practices and tools such as Rally or similar project tracking systems.

• Experience working directly with customers, partners, and third-party developers.

🏖️ Benefits

• Competitive salary and performance-based bonuses

• Comprehensive benefits package

• Career development and training opportunities

• Flexible work arrangements (remote and/or office-based)

• Dynamic and inclusive work culture within a globally renowned group

• Private health insurance

• Pension plan

• Paid time off

• Training & development


Similar Jobs

October 30

SEGULA Technologies

10,000+ employees

🚀 Aerospace

⚡ Energy

Data Analyst supporting engineering and design in automotive systems. Analyzing data, maintaining reports, and collaborating with teams to enhance decision-making.

🇲🇽 Mexico – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

VBA

October 28

Paquetexpress

5,001 – 10,000 employees

Data Architect designing robust database structures linked to business objectives for PAQUETEXPRESS. Implementing analytical solutions focused on BI and Big Data while ensuring data governance and security.

🇲🇽 Mexico – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

🗣️🇪🇸 Spanish Required

September 19

Sequoia Connect

11 – 50 employees

🎯 Recruiter

👥 HR Tech

🏢 Enterprise

Big Data Engineer building ELT pipelines with Airflow and Snowflake at a global BPO/IT services firm. Focus on AWS, dbt, Python, and data governance.

🇲🇽 Mexico – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

September 10

Sequoia Connect

11 – 50 employees

🎯 Recruiter

👥 HR Tech

🏢 Enterprise

Design, develop, and optimize Microsoft Fabric ETL pipelines and data models. Collaborate with analysts and data scientists at a global IT/BPO services provider.

🇲🇽 Mexico – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

🗣️🇪🇸 Spanish Required

September 7

Sequoia Connect

11 – 50 employees

🎯 Recruiter

👥 HR Tech

🏢 Enterprise

Build and optimize Databricks/Apache Spark data pipelines for analytics. Collaborate with data scientists and stakeholders at a global IT consulting and BPO firm.

🇲🇽 Mexico – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

🗣️🇪🇸 Spanish Required

Developed by Lior Neu-ner. I'd love to hear your feedback — Get in touch via DM or support@remoterocketship.com