Data Engineer

Job not on LinkedIn

June 14

Apply Now

VIZX International

Business Process Outsourcing and Recruitment Process Outsourcing Made Easy.

51 - 200 employees

📋 Description

• Architect and maintain a high-performing data warehouse infrastructure on AWS
• Design and optimize scalable data models using Amazon Redshift
• Build, monitor, and maintain robust ETL/ELT workflows using Apache Airflow
• Develop insightful dashboards and visualizations in Amazon QuickSight
• Integrate various AWS services (DynamoDB, S3, EventBridge) to support batch and real-time data flows
• Collaborate with cross-functional teams to understand data needs and deliver reliable solutions
• Monitor infrastructure cost and performance; implement optimizations proactively
• Uphold strong data quality, reliability, and security standards throughout the pipeline

🎯 Requirements

• 4+ years of experience in data engineering, with a focus on cloud-based environments
• Advanced proficiency in SQL and Python
• Proven experience with AWS services including Redshift, Airflow, S3, DynamoDB, EventBridge, and QuickSight
• Solid knowledge of data modeling, warehousing principles, and analytics best practices
• Familiarity with CI/CD workflows and infrastructure-as-code tools (e.g., Terraform, CloudFormation)
• Strong communication and problem-solving skills with a proactive, detail-oriented mindset
• Comfortable working independently in a fast-paced, distributed team environment

🏖️ Benefits

• Excellent career growth in a high-impact role
• Flexible, remote work environment with autonomy
• Access to advanced tools and cloud infrastructure
• Competitive compensation and long-term career opportunities

