Data Engineer, GCP

Job not on LinkedIn

November 13

🗣️🇫🇷 French Required


EY

Finance • Consulting • Technology

EY is a global professional services firm recognized for its audit, tax, consulting, and advisory services. It helps clients solve complex problems and transform their businesses using data and technology, and is committed to building a better working world by enhancing trust in financial markets and economies worldwide. Its offerings include corporate finance advisory, transaction strategy, and technology consulting, serving sectors such as health, energy, finance, and government. EY also emphasizes sustainability and innovation in its solutions.

10,000+ employees

Founded 1989

💸 Finance

📋 Description

• Design and build high-performance ETL/ELT pipelines on GCP (BigQuery, Dataflow, Dataproc, Pub/Sub, Composer).
• Model and maintain scalable, optimized data warehouses and data lakes.
• Define and implement reliable, secure cloud data architectures that incorporate DataOps best practices.
• Work closely with business and technical teams to co-build data-driven solutions.
• Ensure data quality, security, and governance in complex cloud environments.
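The pipeline work described above follows the classic extract → transform → load pattern. As a minimal, hedged sketch of that pattern in Python (using the standard library's sqlite3 as a stand-in for BigQuery, since the actual GCP services require credentials; all table and function names here are illustrative, not from the listing):

```python
import sqlite3

def extract():
    # Extract: in a real GCP pipeline this would read from Pub/Sub or Cloud Storage.
    return [("2024-01-01", "fr", "120"), ("2024-01-01", "de", "80")]

def transform(rows):
    # Transform: filter to the rows of interest and normalize types.
    return [(day, country, int(n)) for day, country, n in rows if country == "fr"]

def load(rows, conn):
    # Load: in a real pipeline this would be a BigQuery load job or Dataflow sink.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS daily_sales (day TEXT, country TEXT, n INTEGER)"
    )
    conn.executemany("INSERT INTO daily_sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(n) FROM daily_sales").fetchone()[0]
```

In production on GCP, the same three stages would typically be expressed as tasks in a Cloud Composer (Airflow) DAG, with Dataflow handling the transform at scale.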

🎯 Requirements

• 3 to 5 years of Data Engineering experience on Google Cloud Platform.
• Proficiency with BigQuery, Dataflow, Dataproc, Pub/Sub, Composer, SQL, and Python.
• Knowledge of modern data architectures (Data Lake, Data Warehouse, Data Mesh).
• Proficiency with CI/CD tools, Git, Docker, and Kubernetes; Terraform is a plus.
• Ability to turn business needs into effective technical solutions.
• Technological curiosity, rigor, team spirit, autonomy, and client focus.
• Excellent communication skills in French and English, and an aptitude for teaching.

🏖️ Benefits

• A stimulating, collaborative, international work environment.
• Career growth and professional development opportunities.
• High-impact projects with strong strategic exposure.
• The opportunity to join a globally recognized Big Four firm.


Similar Jobs

November 4

Devoteam

5001 - 10000 employees

🤖 Artificial Intelligence

🔒 Cybersecurity

Data Engineer at Devoteam focusing on advanced data pipelines. Working with clients across diverse industries and maintaining the highest code quality standards.

🇫🇷 France – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

🗣️🇫🇷 French Required

October 28

Devoteam

5001 - 10000 employees

🤖 Artificial Intelligence

🔒 Cybersecurity

Data Engineer at Devoteam designing and constructing data pipelines with Databricks. Collaborate with clients in various sectors like industry, banking, and energy for data transformation.

🇫🇷 France – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

🗣️🇫🇷 French Required

October 18

Devoteam

5001 - 10000 employees

🤖 Artificial Intelligence

🔒 Cybersecurity

Data Engineer role in Lyon, building robust data pipelines with Databricks for diverse clients. Focus on innovation and continuous development in a collaborative environment.

🇫🇷 France – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

🗣️🇫🇷 French Required

August 27

Devoteam

5001 - 10000 employees

🤖 Artificial Intelligence

🔒 Cybersecurity

Data Architect/Engineer at Devoteam implementing Big Data pipelines and ETL, supporting digital transformation for enterprise clients.

🇫🇷 France – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

🗣️🇫🇷 French Required

August 11

Hitachi Solutions America

501 - 1000 employees

🏢 Enterprise

🤖 Artificial Intelligence

Senior Data Engineer at Hitachi Solutions Europe, a global digital data consultancy. Designs and builds cloud data pipelines with Spark, Azure Synapse, Databricks and Fabric.

🇫🇷 France – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

🗣️🇫🇷 French Required

Developed by Lior Neu-ner. I'd love to hear your feedback — Get in touch via DM or support@remoterocketship.com