
SaaS • Software Development • AI/ML
Curotec is a product development company specializing in software solutions. They provide teams with expertise in web and mobile development, AI/ML capabilities, and e-commerce solutions. Curotec partners with enterprises and product teams to drive efficiency and innovation in their projects, leveraging experience in technologies such as Laravel, Python, and Node.js to ensure successful outcomes for their clients.
• Support the ingestion, processing, and synchronization of data across the analytics platform
• Build and maintain Python Notebooks to ingest data from third-party APIs
• Design and implement Medallion layer architecture (Bronze, Silver, Gold) for structured data organization and progressive data refinement
• Store and manage data within Microsoft Fabric's Data Lake and Data Warehouse using Delta parquet file formats
• Set up data pipelines and sync key datasets to Azure Synapse Analytics
• Develop PySpark-based data transformation processes across the Bronze, Silver, and Gold layers (see the sketch after this list)
• Collaborate with developers, analysts, and stakeholders to ensure data availability and accuracy
• Monitor, test, and optimize data flows for reliability and performance
• Document processes and contribute to best practices for data ingestion and transformation
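To illustrate the kind of Bronze-to-Gold refinement the responsibilities describe, here is a minimal PySpark sketch. The table paths and column names (raw_ad_spend, campaign_id, spend_usd) are hypothetical placeholders, and it assumes a Fabric or Spark environment where Delta Lake is available:

# Minimal sketch of a Bronze -> Silver -> Gold refinement pass in PySpark.
# Paths and columns are illustrative, not taken from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-refinement").getOrCreate()

# Bronze: raw API payloads landed as-is in Delta format.
bronze = spark.read.format("delta").load("Tables/bronze/raw_ad_spend")

# Silver: deduplicate, enforce types, and drop malformed rows.
silver = (
    bronze
    .dropDuplicates(["campaign_id", "report_date"])
    .withColumn("report_date", F.to_date("report_date"))
    .withColumn("spend_usd", F.col("spend_usd").cast("double"))
    .filter(F.col("campaign_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").save("Tables/silver/ad_spend")

# Gold: aggregate to the grain analysts and reports actually query.
gold = (
    silver
    .groupBy("report_date", "campaign_id")
    .agg(F.sum("spend_usd").alias("total_spend_usd"))
)
gold.write.format("delta").mode("overwrite").save("Tables/gold/daily_campaign_spend")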
• Strong experience with Python for data ingestion and transformation
• Proficiency with PySpark for large-scale data processing
• Proficiency in working with RESTful APIs and handling large datasets (see the ingestion sketch below)
• Experience with Microsoft Fabric or similar modern data platforms
• Understanding of Medallion architecture (Bronze, Silver, Gold layers) and data lakehouse concepts
• Experience working with Delta Lake and parquet file formats
• Understanding of data warehousing concepts and performance tuning
• Familiarity with cloud-based workflows, especially within the Azure ecosystem
• Hands-on experience working with API-based data ingestion and modern data architectures
• Nice to have: Experience with marketing APIs such as Google Ads or Google Analytics 4
• Nice to have: Familiarity with Azure Synapse and Data Factory pipeline design
• Nice to have: Understanding of data modeling for analytics and reporting use cases
• Nice to have: Experience with AI coding tools
• Nice to have: Experience with Fivetran, Airbyte, and Rivery
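As a companion to the API-ingestion requirement, here is a hedged, notebook-style sketch that pages through a hypothetical REST endpoint and appends the raw payload to a Bronze Delta table. The URL, auth token, pagination scheme, and response shape are illustrative assumptions; managed connectors such as Fivetran or Airbyte would typically replace hand-rolled code like this:

# Sketch of third-party API ingestion into a Bronze Delta table.
# Endpoint, token, and payload shape are hypothetical.
import requests
from pyspark.sql import Row, SparkSession

spark = SparkSession.builder.appName("bronze-ingestion").getOrCreate()

def fetch_pages(url: str, token: str):
    """Yield one page of records at a time, following a simple page cursor."""
    page = 1
    while True:
        resp = requests.get(
            url,
            headers={"Authorization": f"Bearer {token}"},
            params={"page": page, "page_size": 500},
            timeout=30,
        )
        resp.raise_for_status()
        records = resp.json().get("results", [])
        if not records:
            break
        yield records
        page += 1

for batch in fetch_pages("https://api.example.com/v1/metrics", token="PLACEHOLDER"):
    # Land the payload unmodified: Bronze keeps the raw shape for replayability.
    df = spark.createDataFrame([Row(**record) for record in batch])
    df.write.format("delta").mode("append").save("Tables/bronze/raw_metrics")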
• Competitive salary
• Ability to grow and advance your career
• Attend virtual developer conferences
• Work on cutting-edge and exciting projects
September 10
Senior Data Engineer building large-scale AI-driven data infrastructure at Altimate AI. Designing PB-scale pipelines, SQL intelligence, and cloud-native systems, with working hours overlapping US Pacific Time.
Airflow
AWS
Cloud
Kubernetes
Open Source
Python
SQL
September 9
Lead Data Engineer building and scaling Hopscotch Primary Care's data platform, pipelines, and governance to support care delivery and analytics.
AWS
Azure
Cloud
Google Cloud Platform
PySpark
Python
September 6
Data Architect designing enterprise Databricks Lakehouse for Live Nation Entertainment. Lead architecture, modeling, governance, and mentorship across data platform.
United States – Remote
$144k - $180k / year
Post-IPO Debt on 2023-01
Full Time
Senior
Lead
Data Engineer
AWS
Azure
Cloud
ETL
Hadoop
Kafka
NoSQL
Spark
September 5
Data Engineer building scalable data pipelines and platforms for ForceMetrics, a public-safety data analytics startup. Empowering internal and external stakeholders with reliable data and analytics.
United States – Remote
$90k - $120k / year
Full Time
Mid-level
Senior
Data Engineer
Apache
BigQuery
Cyber Security
Docker
ElasticSearch
Kubernetes
MySQL
Postgres
Python
SQL
Tableau
September 5
Senior Data Engineer building and optimizing AWS data pipelines for a government-focused digital services company. Develop ETL, data architecture, and data quality solutions for federal clients.
United States – Remote
$113.2k - $127.9k / year
Full Time
Senior
Data Engineer
Airflow
Amazon Redshift
Apache
AWS
Cloud
EC2
ETL
Hadoop
Java
JavaScript
MySQL
Postgres
Python
Scala
Spark
SQL