GCP Data Architect


3 hours ago


Neurons Lab

Neurons Lab is a globally distributed AI R&D company that helps deep-tech innovators accelerate the development and launch of data-driven products. Our team combines expertise in the fundamental sciences, full-stack AI/ML engineering, and product design. This rare combination, together with access to scarce talent, allows Neurons Lab to build disruptive solutions for clients in the HealthTech and EnergyTech industries. Neurons Lab operates a proprietary delivery framework tailored to the innovation environment: fierce competition, tight timelines, little-to-no data, and the need to generate novel solutions.

51 - 200 employees

💰 Corporate Round on 2022-10

📋 Description

• Design end-to-end data architectures combining GCP data services (BigQuery, Dataflow, Data Catalog, Dataplex) with on-premise systems (e.g., Oracle)
• Establish data governance frameworks with cataloging, lineage, and quality controls
• Build production data pipelines based on approved architectures
• Implement data warehouses: schema creation, partitioning, clustering, optimization, and security setup
• Deploy data governance frameworks: Data Catalog configuration, metadata tagging, lineage tracking, and quality monitoring
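To illustrate the warehouse-implementation work described above (schema creation, partitioning, clustering), here is a minimal, hedged sketch of the kind of BigQuery DDL such a role might produce. The dataset, table, and column names are illustrative assumptions, not taken from the posting:

```python
def build_transactions_ddl(dataset: str, table: str) -> str:
    """Return a BigQuery DDL statement for a date-partitioned,
    clustered table (illustrative schema only)."""
    return f"""
CREATE TABLE IF NOT EXISTS `{dataset}.{table}` (
  txn_id STRING NOT NULL,
  customer_id STRING,
  amount NUMERIC,
  txn_date DATE
)
PARTITION BY txn_date   -- partitioning lets queries prune scans by date
CLUSTER BY customer_id  -- clustering co-locates rows for customer lookups
OPTIONS (description = 'Illustrative transactions table');
""".strip()

print(build_transactions_ddl("risk_dw", "transactions"))
```

In practice a statement like this would be submitted through the BigQuery console or a client library; the point is only that partitioning and clustering are declared in the table DDL itself.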

🎯 Requirements

• 7+ years in data architecture, data engineering, or solution architecture roles
• 4+ years hands-on with GCP data services (BigQuery, Dataflow, Data Catalog, Dataplex) in production implementations
• 3+ years in data governance (MANDATORY): metadata management, data lineage, data quality frameworks, and data cataloging
• 3+ years in the BFSI/banking domain (MANDATORY): AML, KYC, regulatory reporting, and compliance requirements
• 5+ years with SQL and relational databases: complex query writing, optimization, and performance tuning
• 3+ years in data modeling: dimensional modeling, data vault, or other data warehouse methodologies
• 2+ years in presales/architecture roles: requirements gathering, solution design, and client presentations
• Experience with on-premise data platforms (MANDATORY), e.g., Teradata, Oracle, or SQL Server integration with the cloud

🏖️ Benefits

• Part-time, long-term engagement with project-based allocations
• Direct report to the Head of Cloud


