
IT • B2B • Enterprise
Source2IT provides tailored IT solutions to accelerate the technological development of businesses. The company specializes in agile services, IT squads, outsourcing, and infrastructure management, relying on skilled professionals to keep systems running efficiently and to deliver projects to a high standard. With over 8 years of experience, it focuses on delivering innovative, effective IT expertise while promoting diversity and inclusion within its workforce.
201 - 500 employees
Founded 2016
🤝 B2B
🏢 Enterprise
November 14
🗣️🇧🇷🇵🇹 Portuguese Required

The professional will be responsible for designing, implementing, and operating initiatives in the following areas:
1. Platform & Infrastructure: Design and standardize the architecture for data ingestion, transformation, and delivery on Snowflake. Define best practices for modeling (data vault, dimensional, lakehouse patterns) and layers (raw, staging, curated, consumption). Integrate pipelines with orchestrators (e.g., Airflow, Prefect) and observability tools.
2. Data Cataloging: Implement and manage a data catalog (e.g., Collibra, Alation, Amundsen, Data Catalog) and technical and business metadata. Ensure full data lineage between sources, pipelines, and analytical consumption.
3. Process Standardization: Create policies and playbooks for onboarding new sources, pipeline versioning, and schema approval. Define SLAs/acceptance criteria for data deliveries and documentation templates (data contracts).
4. Data Quality: Design and operationalize data quality frameworks (e.g., Great Expectations, Deequ) and automated monitoring. Define quality KPIs (completeness, accuracy, temporal conformity) and correction/alert routines (a minimal sketch follows this list).
5. Data Security: Implement data security and compliance policies (LGPD, access controls, encryption, masking, logging). Manage access governance (RBAC/ABAC), auditing, and environment segregation (dev/test/prod).
6. Cultural Change and Organizational Governance: Promote an analytical culture: KPI definition, training, workshops, and formation of data committees. Act as a bridge between business areas and technical teams, facilitating prioritization and adoption of best practices.
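For illustration only (not part of the posting): a minimal plain-pandas sketch of the quality KPIs named in item 4, with a simple alert routine. In practice a framework such as Great Expectations or Deequ would declare and schedule checks like these; the "orders" columns and the 99% SLA threshold below are hypothetical.

```python
import pandas as pd

def quality_kpis(df: pd.DataFrame) -> dict:
    """Compute the three KPIs named in the posting over a hypothetical orders table."""
    now = pd.Timestamp.now(tz="UTC")
    return {
        # Completeness: share of non-null values in a mandatory column.
        "customer_id_completeness": df["customer_id"].notna().mean(),
        # Accuracy: share of rows satisfying a business rule (amount > 0).
        "amount_accuracy": (df["amount"] > 0).mean(),
        # Temporal conformity: share of timestamps that are not in the future.
        "order_ts_conformity": (pd.to_datetime(df["order_ts"], utc=True) <= now).mean(),
    }

def check_and_alert(df: pd.DataFrame, sla: float = 0.99) -> None:
    """Correction/alert routine: flag any KPI that falls below the agreed SLA."""
    for kpi, value in quality_kpis(df).items():
        if value < sla:
            print(f"ALERT: {kpi} = {value:.2%} is below the {sla:.0%} SLA")

if __name__ == "__main__":
    sample = pd.DataFrame({
        "customer_id": [1, 2, None],
        "amount": [10.0, -5.0, 20.0],
        "order_ts": ["2024-01-01", "2024-01-02", "2024-01-03"],
    })
    check_and_alert(sample)
```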
• Minimum of 5 years of proven experience in data engineering/analytics, with a focus on building pipelines and integrations for Snowflake.
• Hands-on knowledge of Snowflake (SQL on Snowflake, Snowpipe, zero-copy cloning, roles, Streams & Tasks); a minimal sketch of the Streams & Tasks pattern follows this list.
• Experience with transformation, orchestration, and code-versioning tools.
• Experience with catalog/metadata tools (Collibra, Alation, Amundsen, or similar) and implementing lineage.
• Implementation of data quality frameworks.
• Strong understanding of data security and compliance (LGPD), access management (RBAC/ABAC), and auditing best practices.
• Strong communication skills and experience conducting workshops, committees, and team enablement.
• Technical English (reading and writing) for consulting documentation and reports.
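For illustration only (not part of the posting): a hedged sketch of the Snowflake Streams & Tasks pattern listed above, deployed through the snowflake-connector-python driver. The connection parameters, warehouse, and raw/curated table names are hypothetical placeholders, not values from the posting.

```python
import snowflake.connector

# Each statement is run separately: the connector executes one statement
# per execute() call.
STATEMENTS = [
    # Capture row-level changes on the (hypothetical) raw table.
    "CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw.orders",
    # Scheduled task that drains the stream into the curated layer,
    # running only when the stream actually has new data.
    """
    CREATE OR REPLACE TASK load_curated_orders
      WAREHOUSE = transform_wh
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      INSERT INTO curated.orders
      SELECT order_id, customer_id, amount, order_ts
      FROM raw_orders_stream
      WHERE METADATA$ACTION = 'INSERT'
    """,
    # Tasks are created suspended; RESUME starts the schedule.
    "ALTER TASK load_curated_orders RESUME",
]

def deploy() -> None:
    conn = snowflake.connector.connect(
        account="my_account",   # hypothetical credentials and context
        user="my_user",
        password="my_password",
        role="SYSADMIN",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    cur = conn.cursor()
    try:
        for sql in STATEMENTS:
            cur.execute(sql)
    finally:
        cur.close()
        conn.close()

if __name__ == "__main__":
    deploy()
```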