November 6
🗣️🇧🇷🇵🇹 Portuguese Required
• Design and maintain large-scale data ingestion and transformation pipelines (batch and streaming).
• Ensure data quality, reliability, and integrity across the entire lifecycle (testing, versioning, monitoring).
• Implement and evolve data architecture on GCP (BigQuery, Cloud Storage, Pub/Sub, Composer); see the pipeline sketch after this listing.
• Collaborate with Analytics, Product, and Engineering teams to define data needs and translate business requirements into technical solutions.
• Contribute to the development of data engineering best practices and internal standardization of models, naming conventions, and monitoring.
• Strong experience with Python and SQL (preferably also Java or Scala).
• Experience with Airflow, Airbyte, or equivalent orchestration and replication frameworks.
• Knowledge of data modeling and design best practices (Kimball, Data Vault, Medallion).
• Experience with GCP (BigQuery, Cloud Storage, Pub/Sub, Composer, IAM).
• Familiarity with streaming (Kafka, Dataflow, or Beam) is a plus.
• Experience with infrastructure-as-code (Terraform) and CI/CD pipelines.
• Product-oriented mindset and cross-functional collaboration.
• N/A
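Purely as an illustration of the kind of Composer (Airflow) batch pipeline this role describes, here is a minimal sketch that loads files from Cloud Storage into BigQuery and then runs a SQL transformation. The DAG id, bucket, project, dataset, and table names are hypothetical placeholders, not details from the posting.

```python
# Hypothetical Cloud Composer (Airflow 2.4+) DAG sketch: GCS -> BigQuery ingestion.
# All resource names below are placeholders, not taken from the listing.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="gcs_to_bigquery_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load newline-delimited JSON files landed in Cloud Storage into a raw table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="example-landing-bucket",            # placeholder bucket
        source_objects=["events/{{ ds }}/*.json"],  # partitioned by execution date
        destination_project_dataset_table="example_project.raw.events",
        source_format="NEWLINE_DELIMITED_JSON",
        autodetect=True,
        write_disposition="WRITE_APPEND",
    )

    # Rebuild a curated table from the raw layer with a BigQuery SQL job.
    build_curated = BigQueryInsertJobOperator(
        task_id="build_curated_events",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `example_project.curated.events` AS "
                    "SELECT * FROM `example_project.raw.events` "
                    "WHERE DATE(event_ts) = '{{ ds }}'"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_curated
```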
November 6
Data Architect responsible for designing and implementing modern data architectures. Collaborating across teams to ensure data solutions align with business objectives at Aggrandize.
🗣️🇧🇷🇵🇹 Portuguese Required
November 5
Data Engineer focused on data analytics, building dashboards and collaborating with teams. Work remotely with a focus on innovative solutions in financial services.
🇧🇷 Brazil – Remote
💰 $80M Venture Round on 2000-05
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
🗣️🇧🇷🇵🇹 Portuguese Required
November 4
Data Engineer constructing and maintaining data solutions for security analytics and fraud prevention at Sicredi. Focused on data quality and integrity for strategic analysis.
🗣️🇧🇷🇵🇹 Portuguese Required
November 4
Data Engineer building modern security platform at Sicredi, ensuring data integrity and enabling strategic analysis for security.
November 4
2 - 10
Senior Data Engineer responsible for designing and optimizing Data Lake architectures and data pipelines on AWS. Collaborating with cross-functional teams to deliver scalable solutions.