Senior Data Engineer – AWS

Job not on LinkedIn

June 19

Apply Now

Aptus Data Labs

Artificial Intelligence • SaaS • B2B

Aptus Data Labs is a data engineering and enterprise AI company that builds scalable AI platforms, generative intelligence solutions, and data modernization services for large organizations. The company delivers industry-focused AI and analytics products (including aptplan, aptGenAI and other platforms) and services—covering advisory, cloud migration, MLOps/LLMOps, AI governance, and on-demand talent—to help pharmaceutical, banking, manufacturing, retail and other enterprises accelerate decision-making, compliance, and operational efficiency. Aptus partners with cloud and AI providers, offers pre-built accelerators and IP, and focuses on B2B deployments and enterprise-scale SaaS solutions.

📋 Description

• Design and develop reliable, reusable ETL/ELT pipelines using AWS Glue, Python, and Spark.
• Process structured and semi-structured data (e.g., JSON, Parquet, CSV) efficiently for analytics and AI workloads.
• Build automation and orchestration workflows using Airflow or AWS Step Functions.
• Implement AWS-native data lake/lakehouse architectures using S3, Redshift, Glue Catalog, and Lake Formation.
• Consolidate data from APIs, on-prem systems, and third-party sources into a centralized platform.
• Optimize data models and partitioning strategies for high-performance queries.
• Ensure secure data architecture practices across AWS components using encryption, access control, and policy enforcement.
• Collaborate with platform and security teams to maintain compliance and audit readiness (e.g., HIPAA, GxP).
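To illustrate the partitioning work described above: data lakes on S3 conventionally lay files out in Hive-style `key=value` path segments so query engines (Athena, Redshift Spectrum, Spark) can prune partitions instead of scanning everything. A minimal stdlib-only sketch of that grouping logic, assuming hypothetical `region`/`dt` partition keys (a real pipeline would do this with Glue/Spark writing Parquet to S3):

```python
from collections import defaultdict

def partition_records(records, key_fields):
    """Group JSON-like records under Hive-style partition paths,
    e.g. 'region=eu/dt=2024-01-01'. Query engines use these path
    segments to skip irrelevant data at read time."""
    partitions = defaultdict(list)
    for rec in records:
        # Build the partition path from the chosen key columns.
        path = "/".join(f"{k}={rec[k]}" for k in key_fields)
        partitions[path].append(rec)
    return dict(partitions)

records = [
    {"region": "eu", "dt": "2024-01-01", "amount": 10},
    {"region": "us", "dt": "2024-01-01", "amount": 20},
    {"region": "eu", "dt": "2024-01-01", "amount": 5},
]
parts = partition_records(records, ["region", "dt"])
```

Choosing low-cardinality, frequently filtered columns (dates, regions) as partition keys is what keeps per-query scan costs down; over-partitioning on high-cardinality keys produces many tiny files and hurts performance instead.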

🎯 Requirements

• Bachelor’s degree in Computer Science, Engineering, or equivalent.
• 5–8 years of experience in data engineering, preferably in AWS cloud environments.
• Proficient in Python, SQL, and AWS services: Glue, Redshift, S3, IAM, Lake Formation.
• Experience managing IAM roles, security policies, and cloud-based data access controls.
• Hands-on experience with orchestration tools like Airflow or AWS Step Functions.
• Exposure to CI/CD practices and infrastructure automation.
• Strong interpersonal and communication skills, with the ability to convey technical ideas clearly.

🏖️ Benefits

• Health insurance
• 401(k) matching
• Flexible work hours
• Paid time off
• Remote work options


Similar Jobs

June 18

Lead Data Engineer for a growing team at Forbes Advisor, focusing on data engineering best practices.

Airflow, BigQuery, ETL, Google Cloud Platform, Kafka, Python, Spark, SQL, Tableau

June 11

Inorg Global seeks a Data Engineer to build Databricks pipelines for analytics and ML.

Airflow, Apache, AWS, Azure, Cloud, ETL, Google Cloud Platform, Prometheus, Python, Scala, Spark, SQL

May 15

As a Senior Data Engineer at DataRobot, you will develop analytic data products in a cloud environment. This role requires strong data engineering skills and collaboration with analysts and scientists.

Airflow, Amazon Redshift, AWS, Azure, Cloud, EC2, ETL, Google Cloud Platform, Postgres, Python, Scala, Spark, SQL, Terraform

April 30

Join Hitachi Solutions as an Azure Data Architect, designing scalable data solutions on Microsoft Azure.

Azure, Cloud, ETL, MS SQL Server, Oracle, Python, RDBMS, Scala, Spark, SQL, Tableau, Unity

April 22

Join Zingtree as a Senior Data Engineer to design and build data systems for process automation.

Apache, AWS, Cloud, Kafka, Kubernetes, Spark, SQL, Go
