Data Engineer, GCP


September 21


T-Systems International

Enterprise • SaaS • Security

T-Systems International is a leading European IT service provider that plays a significant role in digital transformation by offering consulting, cloud services, digital solutions, security, and connectivity. Known for its holistic and integrated approach, T-Systems combines comprehensive industry expertise with advanced technology solutions to support various sectors including automotive, healthcare, public sector, financial services, and manufacturing. The company's offerings encompass AI strategies, cloud migration services, application services, and cybersecurity, making it a key partner for clients aiming for a future-proof digital environment. With a focus on sustainability and innovation, T-Systems provides tailored solutions globally, supported by a wealth of customer success stories and recognitions for excellence.

10,000+ employees

Founded 2000


📋 Description

• Deploy and manage infrastructure on Google Cloud Platform (GCP), including network architectures, security implementations, DNS, VPN, and load balancing.
• Use Hadoop/Hive and PySpark for querying and transforming data, and implement job orchestration with Airflow (illustrative sketches of both follow this list).
• Manage core GCP services (GKE, Cloud Run, BigQuery, Compute Engine, Composer), primarily via Terraform.
• Develop and implement Python applications for various GCP services.
• Integrate data sources from CDI, Datendrehscheibe (FTP), TARDIS APIs, and Google Cloud Storage.
• Create and deploy workloads across Development (DEV), Testing (TEST), and Production (PROD) environments.
• Implement security and compliance measures, manage IAM policies and secrets, and enforce identity-aware policies.
• Implement AI solutions using Google Vertex AI for building and deploying machine learning models.
• Operate and support data products on ODE and collaborate within Technik Value Stream teams.
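For context, a minimal PySpark sketch of the kind of batch transformation described above, reading raw files from Google Cloud Storage and writing curated output back. The bucket paths and column names are hypothetical, and it assumes a Spark environment with the GCS connector already configured:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical buckets and schema; a real job would read from the landing zone
# fed by the CDI/FTP integrations mentioned in the posting.
spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

orders = spark.read.option("header", True).csv("gs://example-landing/orders/*.csv")

daily_totals = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

# Write curated output back to GCS for downstream consumption (e.g. a BigQuery load).
daily_totals.write.mode("overwrite").parquet("gs://example-curated/orders_daily/")
```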
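Likewise, a minimal Airflow sketch of the job orchestration mentioned above, using the Google provider's BigQuery operator. The DAG ID, dataset, and query are placeholders, not details from the posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical DAG; on Cloud Composer the Google provider package is preinstalled.
with DAG(
    dag_id="orders_daily_aggregate",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # "schedule=" on Airflow >= 2.4
    catchup=False,
) as dag:
    aggregate_orders = BigQueryInsertJobOperator(
        task_id="aggregate_orders",
        configuration={
            "query": {
                "query": (
                    "SELECT order_date, SUM(amount) AS total_amount "
                    "FROM `example-project.staging.orders` "
                    "GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
        location="EU",
    )
```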

🎯 Requirements

• Must be a certified GCP Cloud Architect or Data Engineer.
• Strong proficiency in Google Cloud Platform (GCP).
• Expertise in Terraform for infrastructure management.
• Skilled in Python for application implementation.
• Experience with GitLab CI/CD (Magenta) for automation.
• Deep knowledge of network architectures (Shared VPC, Hub-and-Spoke) and security (IAM, Secret Manager, firewalls, Identity-Aware Proxy); a short Secret Manager sketch follows this list.
• Experience managing GKE, Cloud Run, BigQuery, Compute Engine, and Composer.
• Experience with Hadoop, Hive, and PySpark for data processing and transformations.
• Experience with Airflow for job orchestration.
• Familiarity integrating data sources (CDI, Datendrehscheibe FTP servers, TARDIS APIs, Google Cloud Storage).
• Experience deploying across DEV, TEST, and PROD environments.
• Experience implementing AI solutions using Vertex AI.
• Please send CV in English.
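To illustrate the Secret Manager part of the security requirement, a minimal Python sketch for reading a secret at runtime. The project and secret IDs are hypothetical:

```python
from google.cloud import secretmanager

# Hypothetical identifiers; real values would come from the project's IAM setup.
PROJECT_ID = "example-project"
SECRET_ID = "db-password"

client = secretmanager.SecretManagerServiceClient()
name = f"projects/{PROJECT_ID}/secrets/{SECRET_ID}/versions/latest"

# The caller needs roles/secretmanager.secretAccessor on this secret.
response = client.access_secret_version(request={"name": name})
secret_value = response.payload.data.decode("UTF-8")
```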

🏖️ Benefits

• International, positive, dynamic, and motivated work environment.
• Hybrid work model (telework/face-to-face).
• Flexible schedule.
• Continuous training.
• Flexible Compensation Plan.
• Life and accident insurance.
• More than 26 working days of vacation per year.
• Support network, excellent technology, a new work environment, and the freedom to work autonomously.
• Continuous development opportunities.

Apply Now

Similar Jobs

September 19

Design and evolve dLocal's scalable data platform for payments in emerging markets. Mentor engineers and drive data governance and architecture decisions.

Airflow • Apache • AWS • Cloud • Google Cloud Platform • Python • Spark • SQL

September 17

Build and maintain scalable ETL/ELT pipelines and lakehouse infrastructure; enable AI-driven analytics and data governance for Rithum's commerce platform.

Airflow • Amazon Redshift • Apache • AWS • Cloud • Docker • ETL • Kafka • Kubernetes • RDBMS • Spark • SQL

September 17

Senior Data Engineer building AI-enhanced data infrastructure for Rithum’s commerce network. Design scalable ETL/ELT pipelines and mentor engineers.

Airflow • Amazon Redshift • Apache • AWS • Cloud • Docker • ETL • Kafka • Kubernetes • RDBMS • Spark • SQL

August 20

Mid Data Engineer building data architecture and storage solutions for Volkswagen Group Services. Lead technical data strategy and implement cloud-based data platforms.

AWS • Azure • Cloud • NoSQL • Spark • SQL • Vault

August 12

Join Prima’s Engineering team to bridge ML/data science with engineering. Build data products and pipelines for motor insurance growth.

Airflow • Amazon Redshift • Apache • AWS • Cloud • Kafka • NoSQL • Numpy • Open Source • Pandas • Postgres • Python • RDBMS • Scikit-Learn • Spark • SQL
