GCP Data Engineer – Contractual Role


October 12

Ciphercru Innovation Pvt Ltd

B2B • SaaS • Artificial Intelligence

Ciphercru Innovation Pvt Ltd is a technology services and software development company that builds custom web and mobile applications, AI/ML integrations, and enterprise solutions such as ERP and e‑commerce platforms. The firm provides cross‑platform development, DevOps/cloud infrastructure, UI/UX design, and technical staffing services, serving startups and enterprises across multiple countries with a team of experienced engineers and a history of client engagements.

📋 Description

• Design, build, and maintain scalable data pipelines on GCP (an illustrative pipeline sketch follows this list).
• Work with Azure Databricks to support data processing and ML workflows.
• Build and manage DBT models for data transformation, quality checks, and reporting.
• Handle structured and unstructured data using modern cloud data stack technologies.
• Implement secure coding practices and ensure compliance with data privacy standards (PII handling).
• Collaborate with data scientists to productionize machine learning models.
• Integrate and analyze data from various retail sources (in-store POS and e-commerce platforms).
• Monitor and optimize data workflows for performance, scalability, and cost efficiency.
• Work closely with cross-functional teams to deliver robust and reliable data solutions.
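For context on the first responsibility above, here is a minimal sketch of one such GCP pipeline: an Apache Beam job (runnable on Dataflow) that reads retail order events from Pub/Sub and appends them to BigQuery. The project, subscription, table, and field names are hypothetical assumptions chosen for illustration, not details from this posting.

```python
# Minimal Apache Beam sketch: Pub/Sub -> BigQuery streaming ingest.
# All resource names (project, subscription, table) are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message payload into a BigQuery-ready row."""
    event = json.loads(message.decode("utf-8"))
    return {
        "order_id": event.get("order_id"),
        "channel": event.get("channel"),          # e.g. "pos" or "ecommerce"
        "amount": float(event.get("amount", 0)),
        "event_ts": event.get("event_ts"),
    }


def run() -> None:
    options = PipelineOptions(
        project="my-retail-project",              # hypothetical project id
        region="us-central1",
        runner="DataflowRunner",                  # use "DirectRunner" for local testing
        streaming=True,
    )

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-retail-project/subscriptions/orders-sub"
            )
            | "ParseEvent" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-retail-project:sales.orders",
                schema="order_id:STRING,channel:STRING,amount:FLOAT,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline can be exercised locally with the DirectRunner before being deployed to Dataflow, which keeps iteration on parsing and schema logic cheap.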

🎯 Requirements

• 5+ years of experience in data engineering or related roles.
• Strong expertise in Google Cloud Platform (GCP) data services (BigQuery, Dataflow, Pub/Sub, Cloud Functions).
• Experience working with Azure Databricks for data processing.
• Hands-on experience with DBT (Data Build Tool) in production environments.
• Proficiency in Python and SQL for data processing and ETL pipelines.
• Experience with data orchestration tools such as Airflow and Cloud Composer (see the DAG sketch after this list).
• Knowledge of secure coding practices and handling PII data.
• Experience working with unstructured data (logs, media, text, etc.).
• Understanding of retail datasets (offline and online channels).
• Familiarity with data science concepts and supporting ML model productionization.
• Excellent communication and collaboration skills.
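To illustrate the orchestration requirement, the sketch below shows a small Airflow DAG of the kind that could run on Cloud Composer: a daily BigQuery job that rolls the raw orders table up into a per-channel summary. The DAG id, schedule, dataset, and SQL are assumptions made up for this example and reuse the hypothetical names from the pipeline sketch above.

```python
# Minimal Airflow DAG sketch for Cloud Composer: a daily BigQuery rollup.
# DAG id, schedule, dataset, and query are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_retail_sales_rollup",      # hypothetical DAG name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["retail", "bigquery"],
) as dag:

    # Aggregate raw order events into a daily per-channel summary table.
    rollup_daily_sales = BigQueryInsertJobOperator(
        task_id="rollup_daily_sales",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-retail-project.sales.daily_rollup` AS
                    SELECT DATE(event_ts) AS sale_date,
                           channel,
                           SUM(amount) AS total_amount
                    FROM `my-retail-project.sales.orders`
                    GROUP BY sale_date, channel
                """,
                "useLegacySql": False,
            }
        },
    )
```

In practice a DAG like this would typically chain further tasks (DBT runs, data-quality checks, ML feature refreshes), which matches the shape of work the posting describes.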

Similar Jobs

October 11 • Smart Working
ML Data Engineer specializing in analytics, machine learning, and data engineering at Smart Working, a remote-first company focused on global collaboration.
Tags: AWS, Azure, Cloud, DynamoDB, ETL, Google Cloud Platform, NoSQL, Numpy, Pandas, Postgres, Python, Scikit-Learn

October 10 • CSG (5001 - 10000 employees)
Data Architect responsible for designing and implementing database solutions at CSG. Collaborating with architects to direct changes in business processes and technologies.
Tags: AWS, Postgres

October 7 • Escalent (1001 - 5000 employees)
Data Engineer leading projects in data engineering and integration for Escalent, a data analytics firm. Collaborating with clients and utilizing modern data architectures to set up unified consumer views.
Tags: Airflow, Azure, Cloud, ETL, Python, SQL

October 7 • Escalent (1001 - 5000 employees)
Data Engineer leading projects to create a Unified Consumer View from multiple data sources. Working in a remote-first/hybrid environment in data engineering and integration.
Tags: Airflow, Azure, Cloud, ETL, Python, SQL

October 7 • Ollion
Lead Data Engineer I at Ollion designing and implementing data platforms, APIs, and pipelines while leveraging cloud services and ensuring scalable solutions.
Tags: Airflow, AWS, Docker, IoT, JavaScript, Kubernetes, Microservices, Python, SDLC, SQL
