
PrideLogic has not yet published detailed company information; its website is under construction, and more details may become available once the site is updated.
11 - 50 employees
September 10

• Build, optimize, and scale data pipelines and infrastructure using Python, TypeScript, Apache Airflow, PySpark, AWS Glue, and Snowflake.
• Design, operationalize, and monitor ingest and transformation workflows: DAGs, alerting, retries, SLAs, lineage, and cost controls (a minimal DAG sketch follows this list).
• Collaborate with platform and AI/ML teams to automate ingestion, validation, and real-time compute workflows; work toward a feature store.
• Integrate pipeline health and metrics into engineering dashboards for full visibility and observability.
• Model data and implement efficient, scalable transformations in Snowflake and PostgreSQL.
• Build reusable frameworks and connectors to standardize internal data publishing and consumption.
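For illustration only, here is a minimal sketch of the kind of Airflow workflow described above, with retries, retry backoff, and a task-level SLA configured. The DAG id, owner, schedule, and task bodies are hypothetical, and the code assumes a recent Airflow 2.x release.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: a real task would pull raw data from a source system.
    pass


def transform(**context):
    # Placeholder: a real task would run PySpark / Glue / Snowflake transformations.
    pass


default_args = {
    "owner": "data-eng",                  # hypothetical team name
    "retries": 3,                         # re-run failed tasks automatically
    "retry_delay": timedelta(minutes=5),  # back off between attempts
    "sla": timedelta(hours=1),            # flag tasks that miss their SLA
}

with DAG(
    dag_id="example_ingest_pipeline",     # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task        # simple ingest -> transform dependency
```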
• 4+ years of production data engineering experience.
• Deep, hands-on experience with Apache Airflow, AWS Glue, PySpark, and Python-based data pipelines.
• Strong SQL skills and experience operating PostgreSQL in live environments.
• Solid understanding of cloud-native data workflows (AWS preferred) and pipeline observability (metrics, logging, tracing, alerting).
• Proven experience owning pipelines end-to-end: design, implementation, testing, deployment, monitoring, and iteration.
• Experience with Snowflake performance tuning (warehouses, partitions, clustering, query profiling) and cost optimization.
• Real-time or near-real-time processing experience (e.g., streaming ingestion, incremental models, CDC).
• Hands-on experience with a backend TypeScript framework (e.g., NestJS) is a strong plus.
• Experience with data quality frameworks, contract testing, or schema management (e.g., Great Expectations, dbt tests, OpenAPI/Protobuf/Avro); a simple hand-rolled check of this kind is sketched after this list.
• Background in building internal developer platforms or data platform components (connectors, SDKs, CI/CD for data).
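As a rough illustration of the data quality gating mentioned in the requirements, the PySpark sketch below fails a batch that contains null or duplicate keys. It is hand-rolled rather than using Great Expectations or dbt tests, and the S3 path and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_gate").getOrCreate()

# Hypothetical input: a batch of orders landed by an upstream ingest job.
orders = spark.read.parquet("s3://example-bucket/orders/")

total = orders.count()
null_keys = orders.filter(F.col("order_id").isNull()).count()
duplicate_keys = total - orders.select("order_id").distinct().count()

# Raising here fails the task, letting the orchestrator's retry and
# alerting configuration (see the DAG sketch above) take over.
if null_keys > 0 or duplicate_keys > 0:
    raise ValueError(
        f"Data quality gate failed: {null_keys} null and "
        f"{duplicate_keys} duplicate order_id values"
    )
```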
September 10
Senior Golang engineer building AWS microservices for GFT's BatchOnCloud platform. Implement COBOL/PySpark compatibility and support batch troubleshooting.
🗣️🇧🇷🇵🇹 Portuguese Required
September 10
Senior .NET Backend Developer at GFT leading microservices architecture and technical teams. Focus on .NET Core, cloud, containers, and dev practices.
🗣️🇧🇷🇵🇹 Portuguese Required
September 10
Develop Kotlin/Java microservices, CI/CD, AWS, observability; lead architecture and mentor teams at GFT.
🗣️🇧🇷🇵🇹 Portuguese Required
September 9
Build and operate ETL pipelines with PySpark, Airflow, and AWS Glue; model data in Snowflake and PostgreSQL for analytics and ML.
September 9
Senior Java Developer building scalable Java microservices for a benefits-sector client at CI&T. Focus on APIs, architecture, cloud, and automated testing.
🗣️🇧🇷🇵🇹 Portuguese Required