
B2B • Recruitment
Tecla T is an IT outsourcing and staffing firm that recruits and places technology professionals for companies worldwide. It supplies talent across backend, frontend, mobile, automated testing, QA, DevOps, design, and management, and supports projects on three continents (Oceania, Europe, and the Americas). Tecla T emphasizes high customer and employee satisfaction, low turnover, and global delivery of IT teams and projects.
November 26
🗣️🇧🇷🇵🇹 Portuguese Required

• Work on building an integrated financial data platform.
• Map and refine data sources.
• Identify and document current sources (KMM, Rodopar).
• Assess granularity, format, and quality of available data.
• Create a data dictionary and a glossary of financial metrics.
• Implement the data architecture.
• Develop automated ingestion pipelines (ETL/ELT).
• Process data with cleaning, standardization, and enrichment.
• Model financial data (staging, processing, and cube layers).
• Define the logical model of the semantic layer (facts, dimensions, and hierarchies).
• Normalize and standardize naming conventions to ensure consistency.
• Build the fact layer with appropriate granularity.
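The ingestion and standardization responsibilities above might look like this minimal Python sketch. Only the source names (KMM, Rodopar) come from the listing; the field names, Brazilian number/date formats, and parsing helpers are illustrative assumptions.

```python
# Minimal ETL sketch: extract raw financial rows, clean and standardize
# them, and load them into a staging layer. All schemas are hypothetical.
from datetime import date

# Extract: raw rows as they might arrive from a source system like KMM.
RAW_KMM = [
    {"Conta": "Receita Frete", "valor": "1.250,00", "data": "26/11/2024"},
]

def parse_brl(value: str) -> float:
    """Convert a Brazilian-formatted amount ('1.250,00') to a float."""
    return float(value.replace(".", "").replace(",", "."))

def parse_date_br(value: str) -> date:
    """Parse a dd/mm/yyyy date string."""
    day, month, year = (int(p) for p in value.split("/"))
    return date(year, month, day)

def transform(row: dict) -> dict:
    """Clean and standardize one raw row into the staging schema."""
    return {
        "account": row["Conta"].strip().lower(),
        "amount": parse_brl(row["valor"]),
        "posted_on": parse_date_br(row["data"]),
        "source": "KMM",  # lineage column for documentation/governance
    }

# Load: in a real pipeline this would write to a staging table.
staging = [transform(r) for r in RAW_KMM]
print(staging[0]["amount"])  # 1250.0
```

In practice each step would run as a task in an orchestrator such as Airflow, with the staging output feeding the modeling layers described below.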
• 3–5 years of experience in data engineering.
• Experience developing and maintaining ETL/ELT pipelines.
• Strong SQL skills and experience handling large volumes of data.
• Experience with data processing languages such as Python or Scala.
• Experience with data modeling: staging, processing, and consumption layers.
• Creation of fact and dimension tables.
• Understanding of granularity and hierarchies.
• Familiarity with orchestration and ingestion tools (e.g., Airflow, Databricks, ADF, Glue, or equivalents).
• Experience with relational databases and data warehouse environments.
• Ability to document data sources, business rules, and data structures.
• Knowledge of data governance, cataloging, and best practices.
• Experience with data lake environments or distributed architectures.
• Basic knowledge of CI/CD and version control.
• Prior experience with financial data is a plus.
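The fact/dimension modeling and granularity skills the role asks for can be sketched with SQLite standing in for a warehouse. Every table and column name here is a hypothetical example, not the company's actual model.

```python
# Star-schema fragment: one fact table at a stated grain plus one
# dimension with a hierarchy level, queried with a roll-up aggregation.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_account (
        account_key  INTEGER PRIMARY KEY,
        account_name TEXT NOT NULL,
        category     TEXT NOT NULL      -- hierarchy level above account
    );
    CREATE TABLE fact_ledger (
        account_key INTEGER NOT NULL REFERENCES dim_account(account_key),
        posted_on   TEXT NOT NULL,      -- grain: one row per account per day
        amount      REAL NOT NULL
    );
""")
conn.execute(
    "INSERT INTO dim_account VALUES (1, 'freight revenue', 'revenue')"
)
conn.executemany(
    "INSERT INTO fact_ledger VALUES (?, ?, ?)",
    [(1, "2024-11-01", 1250.0), (1, "2024-11-02", 980.5)],
)

# Roll up from the daily account grain to the category hierarchy level.
row = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_ledger f
    JOIN dim_account d USING (account_key)
    GROUP BY d.category
""").fetchone()
print(row)  # ('revenue', 2230.5)
```

Declaring the grain in the fact table's DDL comment is a common documentation practice that also supports the listing's data-dictionary requirement.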
• Remote