Mid-level Data Engineer, GCP


November 10

🗣️🇧🇷🇵🇹 Portuguese Required


Stefanini Brasil

Artificial Intelligence • Cybersecurity • Cloud

Stefanini Brasil is a leading provider of digital transformation solutions, offering a range of services including artificial intelligence, cybersecurity, cloud enablement, and consulting. With over 35 years of experience, the company focuses on integrating innovative technologies to help organizations enhance their operations and customer experiences across various industries. Their expertise extends to sectors like healthcare, retail, and industrial goods, enabling businesses to optimize processes and drive value through technology.

📋 Description

• Design, develop, and maintain data pipelines using Dataflow on GCP (illustrated in the sketch below this list).
• Integrate and transform data from various sources such as Cloud Storage, Pub/Sub, BigQuery, and PostgreSQL.
• Ensure data quality, security, and governance across all stages of the process.
• Collaborate with BI, Analytics, and Engineering teams to deliver data-driven solutions.
• Optimize performance and cost of cloud solutions.
• Document processes and data engineering best practices.
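The pipeline work described above typically maps to an Apache Beam job run on Dataflow. Below is a minimal, hypothetical sketch (not part of the posting) of a streaming pipeline that reads JSON events from Pub/Sub, drops malformed records, and appends rows to BigQuery; the project, topic, bucket, and table names are assumed placeholders.

# Minimal Dataflow (Apache Beam) sketch: Pub/Sub -> transform -> BigQuery.
# All resource names below are hypothetical placeholders, not from the posting.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        streaming=True,
        project="my-gcp-project",            # hypothetical project ID
        region="us-central1",
        runner="DataflowRunner",
        temp_location="gs://my-bucket/tmp",  # hypothetical Cloud Storage bucket
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "KeepValidRecords" >> beam.Filter(lambda row: "event_id" in row)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()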

🎯 Requirements

• Proven experience with Google Cloud Platform, especially Dataflow, Cloud Storage, and Pub/Sub.
• Knowledge of PostgreSQL databases and data modeling.
• Experience with ETL/ELT processes (see the ELT sketch after this list).
• Familiarity with version control tools (Git) and CI/CD.
• Ability to work in agile, collaborative environments.
• Intermediate English for documentation and communication.
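For the ETL/ELT requirement, a common ELT pattern on GCP is to load raw data into BigQuery first and run the transformation as SQL inside the warehouse. The snippet below is an assumed illustration only; the dataset, table, and column names are hypothetical.

# Minimal ELT sketch: transform already-loaded raw data inside BigQuery.
# Dataset/table/column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project ID

transform_sql = """
CREATE OR REPLACE TABLE analytics.daily_orders AS
SELECT
  DATE(created_at) AS order_date,
  COUNT(*)         AS orders,
  SUM(amount)      AS revenue
FROM raw.orders
GROUP BY order_date
"""

# Run the in-warehouse transformation and wait for it to finish.
job = client.query(transform_sql)
job.result()
print(f"Transform finished, processed {job.total_bytes_processed} bytes")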

🏖️ Benefits

• Meal or food allowance
• Discounts on courses, universities, and language schools
• Stefanini Academy, a platform offering free, up-to-date online courses with certificates
• Mentoring
• Childcare assistance
• Benefits club for medical consultations and exams
• Health insurance
• Dental insurance
• Perks and discounts at top establishments
• Travel club
• Pet care benefits
• and much more...


Similar Jobs

November 10

Senior Data Developer creating scalable data solutions using Databricks. Integrating and transforming data from various sources while mentoring peers and collaborating with cross-functional teams.

🗣️🇧🇷🇵🇹 Portuguese Required

AWS • Azure • Cloud • ETL • Google Cloud Platform • PySpark • Python • SQL

November 6

Jusbrasil • 201 - 500 employees

Data Engineer at Jusbrasil responsible for building data ingestion pipelines and collaborating with various teams. Utilizing advanced data architecture and engineering practices to transform raw data into valuable insights.

🗣️🇧🇷🇵🇹 Portuguese Required

Airflow • BigQuery • Cloud • Google Cloud Platform • Java • Kafka • Python • Scala • SQL • Terraform • Vault

November 6

Data Architect responsible for designing and implementing modern data architectures. Collaborating across teams to ensure data solutions align with business objectives at Aggrandize.

🗣️🇧🇷🇵🇹 Portuguese Required

Azure • ETL • Python • SQL • Unity

November 6

Data Engineer developing and maintaining data pipelines for Serasa Experian. Collaborating with engineering teams and ensuring data integrity and availability.

🗣️🇧🇷🇵🇹 Portuguese Required

Airflow • AWS • ETL • Java • PySpark • Python • Scala • SQL

November 5

Data Engineer focused on data analytics, building dashboards and collaborating with teams. Remote role with a focus on innovative solutions in financial services.

🗣️🇧🇷🇵🇹 Portuguese Required

Azure • SQL
