Mid/Senior Data Engineer, AWS, MongoDB

Job not on LinkedIn

August 19


Curotec

SaaS • Software Development • AI/ML

Curotec is a product development company specializing in software solutions. It provides teams with expertise in web and mobile development, AI/ML capabilities, and e-commerce, partnering with enterprises and product teams to drive efficiency and innovation in their projects, and leveraging technologies such as Laravel, Python, and Node.js to deliver successful outcomes for clients.

51 - 200 employees

Founded 2010

☁️ SaaS

📋 Description

• Participate in the PostgreSQL/MySQL to MongoDB migration and build product‑grade ingestion pipelines for form fills and metadata from a suite of products, enabling seamless ingestion and metrics for internal teams and the client admin portal.

🎯 Requirements

• Plan/execute a phased migration (data modeling, backfills, CDC/dual‑writes, validation, cutover).

• Own product data ingestion: capture form fills and product metadata via APIs/webhooks/streams; define schemas/data contracts; ensure idempotent, validated, PII‑safe loads into MongoDB/Timescale.

• Enable Power BI datasets, dashboards, and reports for internal KPIs.

• Implement data quality and observability (tests, reconciliation, lineage, SLAs/SLOs).

• Apply security and compliance best practices; complete HIPAA training (provided).

• Nice to have: TimescaleDB familiarity (hypertables, retention, compression).
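To illustrate the kind of work the ingestion bullets describe, here is a minimal, hypothetical sketch (not Curotec's actual pipeline): form-fill events get a deterministic document ID derived from their source and event ID, so a replayed webhook upserts instead of duplicating, and assumed PII fields (`email`, `phone`) are hashed before the record is persisted. A plain dict stands in for a MongoDB collection.

```python
import hashlib

# Assumed PII field names for this example only.
PII_FIELDS = {"email", "phone"}

def make_doc_id(source: str, event_id: str) -> str:
    """Deterministic _id: the same source event always maps to the same key,
    so replays and retries are idempotent."""
    return hashlib.sha256(f"{source}:{event_id}".encode()).hexdigest()

def scrub_pii(form_fill: dict) -> dict:
    """Replace PII values with SHA-256 digests before load (illustrative;
    a real pipeline would use salted hashing or tokenization)."""
    return {
        k: hashlib.sha256(str(v).encode()).hexdigest() if k in PII_FIELDS else v
        for k, v in form_fill.items()
    }

def ingest(store: dict, source: str, event_id: str, form_fill: dict) -> str:
    """Upsert the scrubbed document into `store`, keyed by its deterministic _id."""
    doc_id = make_doc_id(source, event_id)
    store[doc_id] = {"_id": doc_id, **scrub_pii(form_fill)}
    return doc_id

store = {}
ingest(store, "forms-app", "evt-123", {"email": "a@b.com", "score": 7})
ingest(store, "forms-app", "evt-123", {"email": "a@b.com", "score": 7})  # replayed webhook
print(len(store))  # the replay upserts; no duplicate row
```

With a real MongoDB collection, the same idea maps to `update_one({"_id": doc_id}, {"$set": doc}, upsert=True)`.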

🏖️ Benefits

• Competitive pay

• Ability to grow and advance your career

• Work on cutting-edge and exciting projects


Similar Jobs

August 19

Hands-on data engineering at Illumination Works; build pipelines, model data, and support analytics with DoD/intel focus.

Apache • AWS • Azure • Cloud • ETL • Google Cloud Platform • Hadoop • Python

August 15

Data Engineer at Human Interest builds scalable data infrastructure and pipelines. Embedded in the Data Analytics team to empower analytics with robust data tooling.

Airflow • AWS • Cloud • ETL • Python • React • SQL • Terraform

August 15

Remote Data Engineer building scalable data pipelines with Snowflake/DBT; role can be based in Latin America or the US, PT hours.

AWS • Cloud • ETL • Node.js • SQL

August 13

Allata

201 - 500 employees

🤝 B2B

Lead Data Engineer Databricks at Allata builds data pipelines for enterprise platforms. Contributes to analytics, data lakehouse implementations, and AI-driven decision support.

AWS • Azure • Cloud • ETL • Jenkins • MS SQL Server • Oracle • PySpark • Spark • SQL

August 13

Senior Data Engineer at Bitly designs and builds a next-gen data platform powering analytics, experimentation, and AI features; collaborates across teams to deliver data to internal and customer applications.

Airflow • Cloud • Docker • Google Cloud Platform • Kubernetes • Python • Spark • SQL • Terraform • Go
