AWS DataHub Developer


August 29


SAPSOL Technologies Inc. : Systems and Process Solutions for your Enterprise

Enterprise • SaaS • Artificial Intelligence

SAPSOL Technologies Inc. is a leader in innovative business transformation, expertly integrating ERP systems, AI, cloud computing, and DevOps to deliver cutting-edge solutions for businesses of all sizes. Specializing in S/4HANA implementation, system migration, and managed support services, SAPSOL ensures seamless integration and optimization of business processes. The company offers advanced business intelligence tools for actionable insights, enabling smarter decision-making and performance enhancement across key operational areas. SAPSOL excels in rapid go-to-market strategies, managing complex global rollouts with precision. Their expertise in process optimization, automation, change management, emerging technology, cloud, AI, and analytics transforms supply chains, financial reporting, and customer service to enhance efficiency and reduce costs. SAPSOL also provides a range of services including security, cloud, and big data analytics, supporting enterprises in staying agile and competitive. The SAPSOL Product Innovation Lab (SPIL) drives advancements like HireRig, an AI-driven recruitment platform. Based in North America, SAPSOL collaborates with startups and invests in disruptive ideas, leveraging partnerships with SAP, Oracle, IBM, and Microsoft technologies to shape the future of business transformation.

📋 Description

We are seeking a Senior AWS DataHub Developer to design and build real-time, event-driven data services on AWS. This role is developer-first (application-side) rather than infrastructure-led. You will architect and deliver Kafka-based streaming pipelines and serverless data applications that ingest, transform, and serve data at scale. You'll collaborate with architects, data engineers, and product teams to deliver secure, resilient, observable, and highly scalable solutions that power enterprise-grade analytics and event-driven applications.

What You'll Do (Key Responsibilities)

• Design & Deliver Event-Driven Pipelines: Build serverless data flows using AWS Lambda, Step Functions, EventBridge, SNS, SQS, and API Gateway.
• Real-Time Streaming: Develop Kafka (Apache Kafka/Amazon MSK) consumers and producers for high-throughput, low-latency streaming and decoupled microservices.
• Microservices & APIs: Build and optimize TypeScript (preferred) or Python services/APIs for data ingestion, transformation, and delivery.
• AWS Data Services Integration: Work with S3, DynamoDB, Glue, Athena, and CloudWatch for storage, metadata, querying, and observability.
• Quality & Reliability: Implement idempotency, retries, dead-letter queues, exactly-once/at-least-once semantics where appropriate, and schema evolution strategies.
• CI/CD & Testing: Use Git-based workflows and CI/CD (e.g., GitHub Actions, Jenkins) with automated tests (unit/integration/load) and infrastructure deployments.
• IaC (Developer View): Define application-layer infrastructure using AWS CDK, Terraform, or CloudFormation, with a strong emphasis on developer productivity and repeatability.
• Agile Collaboration: Contribute to technical design, story sizing, peer reviews, and continuous improvement.

Ideal Candidate Profile: You are a cloud-native, application-side developer who thinks in events, streams, and services, not servers. You design for resiliency, observability, and scale, and you're comfortable pairing Kafka with AWS serverless to deliver business outcomes.
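For a sense of the reliability patterns this role calls for (idempotency, bounded retries, dead-letter queues), here is a minimal TypeScript sketch. It is purely illustrative: the `DataEvent` type and `ReliableProcessor` class are hypothetical names, and in-memory structures stand in for real Kafka/SQS clients and a durable idempotency store.

```typescript
// Illustrative sketch of idempotent, retrying, DLQ-backed event handling.
// In production the dedupe set would be a durable store (e.g. DynamoDB)
// and the dead-letter queue a real SQS queue or Kafka topic.

type DataEvent = { id: string; payload: string };

class ReliableProcessor {
  private seen = new Set<string>();    // idempotency store: processed event ids
  readonly deadLetter: DataEvent[] = []; // DLQ for events that exhaust retries

  constructor(
    private handler: (e: DataEvent) => void,
    private maxRetries = 3,
  ) {}

  process(event: DataEvent): void {
    if (this.seen.has(event.id)) return; // duplicate delivery: drop silently
    for (let attempt = 1; attempt <= this.maxRetries; attempt++) {
      try {
        this.handler(event);
        this.seen.add(event.id); // mark as done only after success
        return;                  // (at-least-once: a crash here means a retry)
      } catch {
        // swallow and retry until attempts are exhausted
      }
    }
    this.deadLetter.push(event); // route poison message to the DLQ
  }
}
```

Marking the event as seen only after the handler succeeds gives at-least-once semantics; the dedupe check on re-delivery is what makes the overall pipeline effectively idempotent.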

