
Recruitment • B2B
EVT is a nearshore staff augmentation company that connects global businesses with top-tier IT professionals from Brazil. They specialize in sourcing, vetting, and delivering developers and IT talent aligned to U.S. time zones, offering cost-effective hiring solutions with cultural and language alignment. EVT manages the end-to-end process—candidate search, technical and cultural validation, contracts, onboarding, and ongoing support—serving companies seeking scalable, agile access to Brazilian tech talent.
October 8
🗣️🇧🇷🇵🇹 Portuguese Required

• We are seeking a Data & Governance Architect with solid experience in solution architecture, technical leadership, and corporate governance in modern data environments.
• The professional will be responsible for defining and implementing scalable architectures on Lakehouse platforms, promoting best practices in governance and DataOps, and acting as a strategic leader for technical and business teams, connecting data to corporate value at global scale.
• Design modern Lakehouse architectures using platforms such as Databricks (Delta Lake, Unity Catalog, Delta Live Tables, Lakehouse Federation).
• Define ingestion and transformation patterns (batch, streaming, event-driven) using Auto Loader, Structured Streaming, and Liquid Clustering.
• Model data in Medallion architectures (Raw, Silver, Gold), including consumption layers (serving, APIs, BI, ML).
• Structure integrations with legacy systems, SaaS, SAP, DMS, ERPs, and corporate applications.
• Support Data Mesh initiatives and data domains, focusing on scalability and ownership.
• Connect data to Analytics, Data Science, and GenAI, enabling advanced use cases (prediction, recommendation, copilots).
• Evaluate and recommend architectural patterns (streaming-first, event-driven, real-time analytics).
• Lead design reviews and proofs of concept for new data solutions.
• Support cost-optimization strategies (compute, storage, caching, auto-scaling).
• Document and disseminate corporate data architecture blueprints.
• Define and operationalize corporate Data Governance frameworks (catalog, glossary, lineage, ownership, stewardship, quality).
• Implement security, privacy, and compliance policies (LGPD, GDPR, HIPAA), including anonymization and encryption.
• Establish data classification policies (sensitivity, retention, criticality) applied across catalogs and data layers.
• Conduct data governance maturity assessments and develop evolution roadmaps.
• Support the design and execution of governance operating models (Hub & Spoke, centralized, federated).
• Introduce governance automations with Unity Catalog and AI/GenAI for classification, automatic lineage, and glossary suggestions.
• Lead processes to define ownership and stewardship with business areas.
• Establish governance metrics and KPIs (catalog coverage, quality levels, glossary adoption, access metrics).
• Support AI Governance initiatives, including model cataloging and model lifecycle management.
• Act as a bridge between business, security, and engineering teams, ensuring adoption and corporate alignment.
• Serve as a technical and architectural reference, guiding engineers, analysts, and data scientists.
• Conduct architecture reviews and propose corporate data blueprints.
• Facilitate communication between technical teams and executives.
• Implement data CI/CD with GitLab/GitHub, Terraform, and Databricks Asset Bundles.
• Ensure data quality through automated tests (DQX, Pytest, SQLFluff).
• Support MLOps practices with MLflow, Feature Store, and model monitoring.
• Explore the use of GenAI and intelligent automation in governance and quality.
• Proven experience in modern data architecture (Lakehouse, Medallion).
• Strong hands-on experience with Databricks (Delta Lake, Unity Catalog, Delta Live Tables, Federation).
• Proficiency in Python for pipelines and automations.
• Practical experience in multi-cloud environments (AWS, Azure).
• Knowledge of Data Governance, security, and compliance.
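For candidates less familiar with the Medallion modeling named in the requirements above, a minimal sketch of the Raw → Silver → Gold flow in plain Python may help. This is illustrative only: in a real Databricks Lakehouse these layers would be Delta tables processed with Spark, and all record fields and function names here are invented for the example.

```python
# Illustrative Medallion-style layering (Raw/Bronze -> Silver -> Gold).
# Plain Python structures stand in for Delta tables to show each layer's role.

raw_events = [  # Raw/Bronze: data exactly as ingested, malformed rows included
    {"user_id": "1", "amount": "19.90", "country": "BR"},
    {"user_id": "2", "amount": "oops", "country": "BR"},  # malformed amount
    {"user_id": "1", "amount": "5.00", "country": "US"},
]

def to_silver(rows):
    """Silver: validated and typed records; bad rows are dropped here
    (in practice they would be quarantined for data-quality review)."""
    silver = []
    for row in rows:
        try:
            silver.append({
                "user_id": int(row["user_id"]),
                "amount": float(row["amount"]),
                "country": row["country"],
            })
        except ValueError:
            continue
    return silver

def to_gold(rows):
    """Gold: business-level aggregate ready for BI/serving consumption."""
    totals = {}
    for row in rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

print(to_gold(to_silver(raw_events)))  # {'BR': 19.9, 'US': 5.0}
```

The point of the pattern is that each layer has one responsibility: Bronze preserves the source faithfully, Silver enforces schema and quality, and Gold serves curated business views.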
• Health insurance
• Meal/Food voucher (Flash)
• Home office allowance
• Corporate English classes at subsidized rates
• Learning incentive programs (Udemy)
October 8
201 - 500
Senior Data Engineer responsible for building data pipelines and ensuring data governance at Extractta, working in a collaborative environment focused on continuous improvement.
🗣️🇧🇷🇵🇹 Portuguese Required
Airflow
Amazon Redshift
AWS
Azure
BigQuery
Cloud
ETL
Google Cloud Platform
Java
Kafka
NoSQL
Python
Scala
Spark
SQL
October 6
Data Engineer at Paschoalotto, designing and maintaining reliable ETL/ELT data pipelines for innovative digital customer solutions. Collaborating with analysts and architects to ensure data quality and governance.
🗣️🇧🇷🇵🇹 Portuguese Required
Airflow
Apache
Azure
ETL
Hadoop
Kafka
MongoDB
NoSQL
Postgres
Python
RabbitMQ
Redis
Spark
SQL
.NET
October 2
1001 - 5000
Lead data engineering and automation development as a Senior Low-Code Engineer at a global consulting firm. Delivering impactful data insights and solutions using Power Platform and Power BI.
🗣️🇧🇷🇵🇹 Portuguese Required
Azure
Cloud
ETL
Python
SQL
October 1
Data Engineer defining strategic technology and data vision for innovative client solutions at Leega. Collaborating on data architecture and leading technological research and development.
🗣️🇧🇷🇵🇹 Portuguese Required
Airflow
Amazon Redshift
AWS
Cassandra
DynamoDB
EC2
MySQL
NoSQL
Oracle
PySpark
Python
Spark
SQL
October 1
Senior Data Engineer at Oowlish enhancing digital solutions for clients in the US and Europe. Building scalable data pipelines and collaborating across teams in a dynamic environment.
Airflow
Apache
ETL
Python
SQL