Senior Data Engineer, Data & AI

November 9

🇨🇴 Colombia – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

Apply Now

Gorilla Logic

SaaS • Enterprise • Artificial Intelligence

Gorilla Logic is a company renowned for its expertise in modern software and data engineering. Serving as a strategic partner rather than just a vendor, Gorilla Logic specializes in digital product design, cloud engineering, data and AI delivery, DevOps, quality assurance, and legacy modernization. With skilled digital product designers, solutions architects, and Agile nearshore delivery teams, Gorilla Logic has been instrumental in developing business-critical software applications for Fortune 500 and SMB companies for over 20 years. Its services include creating SaaS platforms, enhancing digital experiences, and providing flexible, security-focused solutions. Gorilla Logic operates with teams located in Costa Rica, Colombia, Mexico, and the United States, emphasizing collaborative partnerships to deliver cutting-edge digital engineering solutions.

📋 Description

• Design, build, and maintain scalable and resilient CI/CD pipelines for data applications
• Implement and manage Snowflake dbt projects for data transformation
• Develop and manage infrastructure as code (IaC) using Terraform
• Automate deployment, monitoring, and management of Snowflake data warehouse environments
• Collaborate with data engineers and data scientists to provide robust solutions
• Implement monitoring, logging, and alerting systems for data pipelines
• Develop and maintain robust automation scripts primarily using Python
• Ensure security best practices across data infrastructure
• Troubleshoot and resolve issues related to data infrastructure
• Document system architectures and operational procedures
• Stay current with emerging DevOps technologies and best practices
• Optimize data pipelines for performance, scalability, and cost

🎯 Requirements

• Bachelor's degree in Computer Science, Engineering, or a related technical field
• 5+ years of hands-on experience in a DevOps, SRE, or infrastructure engineering role
• 3+ years of experience specifically focused on automating and managing data infrastructure and pipelines
• 1+ year of experience enabling AI features
• Strong, demonstrable experience with Infrastructure as Code tools, particularly Terraform
• Strong background in DevOps principles and practices
• Highly experienced in automation, with proficiency in cloud technologies
• Proven experience with dbt for data transformation in a production environment
• Hands-on experience managing and optimizing Snowflake data warehouse environments
• Demonstrable experience with data modeling techniques
• Strong proficiency in Python for automation and related tasks
• Solid understanding of CI/CD principles and tools
• Experience with cloud platforms (GCP preferred; AWS or Azure also considered) and their data services
• Strong SQL skills

🏖️ Benefits

• Health insurance
• Paid time off
• Professional development opportunities


Similar Jobs

November 8

Senior Data Engineer building and maintaining data pipelines for Xeus Solutions. Designing workflows, ensuring data quality, and collaborating with cross-functional teams.

🇨🇴 Colombia – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

Airflow

Amazon Redshift

AWS

Cloud

DynamoDB

MySQL

Postgres

Python

SQL

November 8

Data Engineer at Influur redefining advertising through creators and AI. Opportunity to build significant technology in influencer marketing.

🇨🇴 Colombia – Remote

💰 $5M Seed Round on 2022-04

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

Airflow

AWS

Cloud

Docker

ETL

Google Cloud Platform

Python

SQL

October 31

Data Engineer IV building and maintaining data infrastructure for Propelus's healthcare solutions. Role involves designing robust data pipelines and optimizing database performance.

Airflow

Amazon Redshift

AWS

Azure

BigQuery

Cloud

Docker

ETL

Hadoop

Java

Kafka

Kubernetes

MySQL

Oracle

Postgres

Python

Scala

Spark

SQL

October 29

Data Engineer at Ekumen maintaining and enhancing data ingestion pipelines with Java and Python. Collaborating with data scientists and engineers to improve data flows and observability.

Apache

AWS

Azure

Cloud

Google Cloud Platform

Java

Python

Spark

October 28

Data Architect designing and implementing data architectures in a remote setting. Focused on ETL processes and data analysis with collaboration across teams.

🗣️🇪🇸 Spanish Required

AWS

ETL

Java

Postgres

Python

SQL
