
SaaS • B2B • Enterprise
Truelogic Software is a nearshore software development company specializing in agile staff augmentation services. They focus on providing custom outsourced software development with a team of highly skilled engineers from Latin America. Truelogic Software partners with both startups and Fortune 500 companies, offering solutions that align with their clients' time zones and ensuring high-quality outcomes through collaboration and responsiveness. With a presence in over 25 countries, Truelogic emphasizes remote work for better quality of life, and their engineers are experienced in various industries, delivering a wide range of successful projects globally.
3 hours ago

• Lead the design and implementation of scalable data pipelines using Databricks and GCP (BigQuery, Cloud Storage, Cloud Composer, etc.).
• Build and optimize data transformations, ingestion layers, and processing workflows to support clean room and identity graph workloads.
• Develop identity graph structures, identity stitching processes, and privacy-preserving data integrations.
• Partner with product managers, data scientists, and senior technical leaders to understand data requirements and deliver robust engineering solutions.
• Present technical designs, project plans, and effort estimates to senior leadership with clarity and confidence.
• Ensure pipeline reliability, cost optimization, and performance tuning across GCP and Databricks environments.
• Collaborate with cross-functional teams to design and support consumer identity, audience modeling, and clean room use cases.
• Maintain clear documentation of architectures, workflows, standards, and best practices.
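The identity stitching mentioned in the responsibilities above typically means linking records that share any identifier (email, device ID, cookie) into one resolved identity. A minimal, illustrative sketch of that idea using union-find over connected components — all names and data here are hypothetical, not part of the actual role's codebase:

```python
# Hypothetical identity-stitching sketch: records sharing any identifier
# are merged into one identity cluster via union-find. Illustrative only.
from collections import defaultdict


def stitch_identities(records):
    """Group record IDs whose identifier sets overlap.

    records: dict mapping record_id -> set of identifier strings.
    Returns a list of sets of record_ids, one set per resolved identity.
    """
    parent = {rid: rid for rid in records}

    def find(x):
        # Walk to the root, compressing the path as we go.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    # Any shared identifier links two records into the same identity.
    seen = {}  # identifier -> first record_id that carried it
    for rid, idents in records.items():
        for ident in idents:
            if ident in seen:
                union(rid, seen[ident])
            else:
                seen[ident] = rid

    clusters = defaultdict(set)
    for rid in records:
        clusters[find(rid)].add(rid)
    return list(clusters.values())


# Example: r1 and r2 share a device ID, so they merge; r3 stands alone.
records = {
    "r1": {"email:a@x.com", "device:123"},
    "r2": {"device:123", "cookie:abc"},
    "r3": {"email:b@y.com"},
}
clusters = stitch_identities(records)
```

In production this logic would run at scale as a distributed connected-components job (e.g. on Spark in Databricks), but the grouping principle is the same.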
• 5+ years of experience in Data Engineering, building scalable data platforms and pipelines.
• Strong, hands-on expertise in Databricks, GCP, and BigQuery.
• Deep understanding of distributed systems, data architecture, and ETL/ELT patterns.
• Experience with identity resolution, identity graphs, or consumer data engineering is a strong plus.
• Excellent problem-solving skills with attention to detail.
• Strong communication skills with the ability to articulate work to technical and non-technical leaders.
• Ability to work independently, drive complex initiatives, and take ownership of deliverables.
• Bachelor's degree in Computer Science or a related field.
• 100% Remote Work: Enjoy the freedom to work from the location that helps you thrive. All it takes is a laptop and a reliable internet connection.
• Highly Competitive USD Pay: Earn excellent, market-leading compensation in USD that goes beyond typical market offerings.
• Paid Time Off: We value your well-being. Our paid time off policies ensure you have the chance to unwind and recharge when needed.
• Work with Autonomy: Enjoy the freedom to manage your time as long as the work gets done. Focus on results, not the clock.
• Work with Top American Companies: Grow your expertise working on innovative, high-impact projects with industry-leading U.S. companies.
2 days ago
Data Engineer role at Keyrus involving data integration and analysis for innovative projects. Join a dynamic team working remotely with global customers on data solutions.
🗣️🇪🇸 Spanish Required
AWS
Azure
Cloud
ETL
SQL
3 days ago
Data Engineer designing and building modern data platforms for risk and compliance initiatives. Collaborating with cross-functional teams to provide scalable data solutions.
Amazon Redshift
AWS
Distributed Systems
ETL
Hadoop
HBase
Python
Scala
Spark
SQL
3 days ago
Data Operations Engineer building and maintaining robust data pipelines. Ensuring data quality for analytics and machine learning initiatives in a collaborative environment.
Airflow
AWS
Azure
Cloud
Python
Spark
SQL
November 26
Senior Data Engineer developing scalable data pipelines for Valtech's global clients. Driving innovation and collaboration in data engineering to enhance AI capabilities.
Airflow
Apache
Azure
BigQuery
Cassandra
Cloud
ETL
Google Cloud Platform
Java
Kafka
MongoDB
MySQL
NoSQL
Postgres
Python
Scala
Spark
SQL
November 21
Data Engineer at Yuno, focusing on ETL processes and scalable data solutions for payment infrastructure. Join a remote team building high-performance payment capabilities globally.
ETL
Java
Python
Spark
SQL