Data Engineer

November 1

🗣️🇪🇸 Spanish Required


Inetum

B2B • Enterprise • SaaS

Inetum is a European leader in digital services, providing technology consulting, solutions, and software innovation to businesses and public sector entities across 19 countries. With a workforce of 28,000 consultants, Inetum focuses on helping clients achieve digital transformation through a broad range of services, including consulting, software integration, and outsourcing, while prioritizing innovation, customer experience, and agility. Inetum partners with major software providers and offers solutions in vertical sectors like public sector, insurance, and healthcare. In 2023, Inetum achieved sales of 2.5 billion euros, driven by its growth and scale ambitions.

10,000+ employees

🤝 B2B

🏢 Enterprise

☁️ SaaS

💰 Post-IPO Equity (March 2007)

📋 Description

• You will join a Data team to implement the client's new operations module on the new Microsoft Fabric platform.

🎯 Requirements

• University degree in engineering or equivalent
• At least 3 years of experience as a data engineer
• Knowledge of PySpark
• Knowledge of ETL tools, whether Data Factory, Databricks, or SQL Server Integration Services
• Prior work with Fabric is highly valued, but since it is so new it is a plus rather than a requirement
• Catalan language required

🏖️ Benefits

• Company training program so you can keep developing and advance within the career plan that exists for you
• Permanent contract and stability
• Flexible compensation and additional benefits
• Flexible working hours
• Hybrid or fully remote working arrangement
• A dynamic environment in which your career plan and growth will be our goal
• Good working atmosphere, open and inclusive
• You will be part of a great team of professionals who are curious about and motivated by technology


Similar Jobs

October 29

Data Architect at GT Motive designing hybrid data architectures between AWS and Azure for efficient data integration. Collaborating with teams to propose data-driven solutions in a flexible remote environment.

AWS

Azure

Cloud

MongoDB

SQL

October 10

Senior Data Engineer managing the data platform for an experience analytics company. Collaborating on data workflows and building reliable data pipelines with modern technologies.

Airflow

Cloud

Python

SQL

September 21

Data Engineer on GCP building and operating data products at T-Systems. Develop infrastructure, data pipelines, CI/CD, security, and AI solutions using Terraform, Python, BigQuery, and Vertex AI.

Airflow

BigQuery

Cloud

DNS

Firewalls

Google Cloud Platform

Hadoop

PySpark

Python

Terraform

September 19

Design and evolve dLocal's scalable data platform for payments in emerging markets. Mentor engineers and drive data governance and architecture decisions.

Airflow

Apache

AWS

Cloud

Google Cloud Platform

Python

Spark

SQL

September 17

Build and maintain scalable ETL/ELT pipelines and lakehouse infrastructure; enable AI-driven analytics and data governance for Rithum's commerce platform.

Airflow

Amazon Redshift

Apache

AWS

Cloud

Docker

ETL

Kafka

Kubernetes

RDBMS

Spark

SQL
