Data Engineer

Job not on LinkedIn

October 1


Remotebase

HR Tech • Recruitment • SaaS

Remotebase is a platform that connects companies with elite, vetted software developers from around the globe. It offers a diverse pool of developers skilled in areas such as Python, JavaScript, AI, and computer vision, each thoroughly vetted for expertise and reliability. Remotebase streamlines hiring by providing a curated selection of candidates, allowing companies to save time and focus on project-critical needs. Its simple three-step process makes it easy to find and hire top talent: share your needs, get matched with developers, and interview to select the best fit.

📋 Description

• Design, build, and maintain scalable and resilient CI/CD pipelines for data applications and infrastructure, with a focus on Snowflake, dbt, and related data tools.
• Implement and manage Snowflake dbt projects for data transformation, including developing dbt models, tests, and documentation, and integrating dbt into CI/CD workflows (a minimal sketch of such a CI step follows this list).
• Develop and manage infrastructure as code (IaC) using Terraform to provision and configure cloud resources for data storage, processing, and analytics on GCP.
• Automate the deployment, monitoring, and management of Snowflake data warehouse environments, ensuring optimal performance, security, and cost-effectiveness.
• Collaborate with data engineers and data scientists to understand their requirements and provide robust, automated solutions for data ingestion, processing, and delivery.
• Implement and manage monitoring, logging, and alerting systems for data pipelines and infrastructure to ensure high availability and proactive issue resolution.
• Develop and maintain robust automation scripts and tools, primarily in Python, to streamline operational tasks, manage data pipelines, and improve efficiency; Bash scripting for system-level tasks is also required.
• Ensure security best practices are implemented and maintained across the data infrastructure and pipelines.
• Troubleshoot and resolve issues related to data infrastructure, pipelines, and deployments in a timely manner.
• Participate in code reviews for infrastructure code, dbt models, and automation scripts.
• Document system architectures, configurations, and operational procedures.
• Stay current with emerging DevOps technologies, data engineering tools, and cloud best practices, particularly around Snowflake, dbt, and Terraform.
• Optimize data pipelines for performance, scalability, and cost.
• Support and contribute to data governance and data quality initiatives from an operational perspective.
• Help implement AI features.
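For illustration only, not part of the original posting: a minimal Python sketch of the kind of dbt CI step described above, running just the models modified since the last production deployment plus their downstream dependents. It assumes the dbt CLI is on PATH and that a production manifest has been copied to a local prod-artifacts directory for state comparison; the paths and selectors are placeholders.

```python
"""Hypothetical CI step: build only the dbt models changed since the
last production run. All paths and selectors below are assumptions."""
import subprocess
import sys


def run(cmd: list[str]) -> None:
    # Propagate any non-zero exit code so the CI job fails fast.
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(result.returncode)


def main() -> None:
    run(["dbt", "deps"])  # install dbt package dependencies
    # Run and test only modified models and everything downstream of
    # them, comparing against the saved production manifest.
    run([
        "dbt", "build",
        "--select", "state:modified+",
        "--state", "prod-artifacts",  # assumed artifact location
    ])


if __name__ == "__main__":
    main()
```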

🎯 Requirements

• Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent experience.
• 5+ years of hands-on experience in a DevOps, SRE, or infrastructure engineering role.
• 3+ years of experience specifically focused on automating and managing data infrastructure and pipelines.
• 1+ years of experience enabling AI features.

Others:

• Strong, demonstrable experience with Infrastructure as Code tools, particularly Terraform.
• Strong background in DevOps principles and practices, with hands-on experience building business intelligence solutions.
• Strong automation and problem-solving skills, with proficiency in cloud technologies.
• Ability to collaborate effectively with data engineers, analysts, and other stakeholders to ensure the reliability and performance of the data ecosystem.
• Proven experience with dbt for data transformation, including developing models and tests and managing dbt projects in a production environment.
• Hands-on experience managing and optimizing Snowflake data warehouse environments (see the illustrative sketch after this list).
• Demonstrable experience with data modeling techniques for ODS, dimensional modeling (facts, dimensions), and semantic models for analytics and BI.
• Strong proficiency in Python for automation, scripting, and data-related tasks; experience with relevant Python libraries is a plus. Strong Bash scripting.
• Solid understanding of CI/CD principles and tools (e.g., Bitbucket Runners, Jenkins, GitLab CI, GitHub Actions, Azure DevOps).
• Experience with cloud platforms (GCP preferred; AWS or Azure also considered) and their data services.
• Experience with containerization technologies (e.g., Docker, Kubernetes) is a plus.
• Knowledge of data integration tools and ETL/ELT concepts.
• Familiarity with monitoring and logging tools.
• Strong SQL skills.
• Ability to work independently and as part of a collaborative team in an agile environment.
• Strong communication skills, with the ability to explain complex technical concepts clearly.
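For illustration only, not part of the original posting: a hedged Python sketch of the kind of Snowflake cost-control automation this role involves, using the snowflake-connector-python package to enforce an AUTO_SUSPEND ceiling across all warehouses. The environment variables, role, and threshold are assumptions, not requirements from the listing.

```python
"""Hypothetical cost-control script: cap AUTO_SUSPEND on every
Snowflake warehouse. Credentials, role, and threshold are assumed."""
import os

import snowflake.connector

MAX_AUTO_SUSPEND_SECONDS = 60  # illustrative policy threshold


def main() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="SYSADMIN",  # assumed role with ALTER WAREHOUSE privilege
    )
    try:
        cur = conn.cursor()
        cur.execute("SHOW WAREHOUSES")
        # Locate columns by name, since SHOW output layout can vary
        # across Snowflake versions.
        cols = [c[0].lower() for c in cur.description]
        name_i, suspend_i = cols.index("name"), cols.index("auto_suspend")
        for row in cur.fetchall():
            name = row[name_i]
            suspend = int(row[suspend_i] or 0)  # 0/None = never suspends
            if suspend == 0 or suspend > MAX_AUTO_SUSPEND_SECONDS:
                cur.execute(
                    f'ALTER WAREHOUSE "{name}" '
                    f"SET AUTO_SUSPEND = {MAX_AUTO_SUSPEND_SECONDS}"
                )
    finally:
        conn.close()


if __name__ == "__main__":
    main()
```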

🏖️ Benefits

• Fully remote with office optional: you decide when to work from home and when from the office.
• Flexible timings: you decide your work schedule.
• Market-competitive compensation (in USD).
• Insane learning and growth.


Similar Jobs

September 30

Design and maintain GCP/AWS data pipelines for Leega Consultoria. Build data models and ETL/ELT processes, and ensure governance for analytics.

🗣️🇧🇷🇵🇹 Portuguese Required

AWS • Cloud • ETL • Google Cloud Platform • NoSQL • SQL • Tableau

September 30

Senior Data Engineer leading SAS-to-GCP migrations for Leega, a data analytics consultancy. Designing BigQuery models, building ETL pipelines, and optimizing cloud infrastructure.

🗣️🇧🇷🇵🇹 Portuguese Required

Apache • BigQuery • Cloud • ETL • Google Cloud Platform • Hadoop • Java • PySpark • Python • Scala • Shell Scripting • Spark • SQL • Terraform

September 30

Build and maintain data pipelines and cloud platforms for Dadosfera's Data & AI platform; lead client projects and design scalable data architectures.

Airflow • Apache • Cloud • Kubernetes • Linux • Python • Scala • Spark • SQL • Tableau • Terraform

September 28

Build data platform, APIs and MLOps for Experian Brazil; integrate batch and real-time pipelines and collaborate with business and tech teams.

🗣️🇧🇷🇵🇹 Portuguese Required

Airflow • AWS • DynamoDB • Python • Spark
