GCP Data Engineer

Job not on LinkedIn

November 4


NextHire

NextHire is a technology-driven recruitment platform that enables companies to hire 2x faster with advanced algorithms built on years of data. The leadership team has several decades of collective experience at top product development companies in India. We believe in breaking the conventional ideas rooted in the staffing industry. We provide end-to-end recruitment solutions and are developing tools that will help shape the future of the recruitment industry. Trusted by 100+ brands, MNCs and startups alike, we are creating a hiring experience like never before.

11 - 50 employees

📋 Description

• Design, build, and maintain scalable data pipelines and workflows.
• Develop and optimize ETL/ELT processes using Python and workflow automation tools.
• Implement and manage data integration between various systems, including APIs and Oracle EBS.
• Work with Google Cloud Platform (GCP) or Google BigQuery (GBQ) for data storage, processing, and analytics.
• Utilize Apache Spark or similar big data frameworks for efficient data processing.
• Develop robust API integrations for seamless data exchange between applications.
• Ensure data accuracy, consistency, and security across all systems.
• Monitor and troubleshoot data pipelines, identifying and resolving performance issues.
• Collaborate with data analysts, engineers, and business teams to align data solutions with business goals.
• Document data workflows, processes, and best practices for future reference.
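To make the ETL/ELT responsibilities above concrete, here is a minimal, self-contained sketch of the extract-transform-load pattern in Python. The payload shape, field names (`id`, `amount`, `region`), and the SQLite sink are all illustrative stand-ins — in practice the sink would be a warehouse such as BigQuery — but the structure (parse, validate and normalize, idempotent load) is the core of the pipeline work described.

```python
import json
import sqlite3

def extract(raw_json: str) -> list[dict]:
    """Extract: parse records from a (hypothetical) API response payload."""
    return json.loads(raw_json)["records"]

def transform(records: list[dict]) -> list[tuple]:
    """Transform: normalize fields and drop incomplete rows (data accuracy)."""
    rows = []
    for r in records:
        if r.get("id") is None or r.get("amount") is None:
            continue  # skip records that fail basic validation
        rows.append((int(r["id"]), float(r["amount"]), r.get("region", "unknown").lower()))
    return rows

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: idempotent upsert into a warehouse-style table (SQLite as a stand-in)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    payload = '{"records": [{"id": 1, "amount": 9.5, "region": "EU"}, {"id": 2, "amount": null}]}'
    with sqlite3.connect(":memory:") as conn:
        loaded = load(transform(extract(payload)), conn)
        print(loaded)  # 1 -- the incomplete record was dropped, not loaded
```

The `INSERT OR REPLACE` keyed on `id` makes reruns safe, which is what lets a scheduler (Airflow, Prefect) retry a failed task without duplicating rows.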

🎯 Requirements

• Strong proficiency in Python for data engineering and workflow automation.
• Experience with workflow orchestration tools (e.g., Apache Airflow, Prefect, or similar).
• Hands-on experience with Google Cloud Platform (GCP) or Google BigQuery (GBQ).
• Expertise in big data processing frameworks, such as Apache Spark.
• Experience with API integrations (REST, SOAP, GraphQL) and handling structured/unstructured data.
• Strong problem-solving skills and ability to optimize data pipelines for performance.
• Experience working in an agile environment with CI/CD processes.
• Strong communication and collaboration skills.
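The API-integration requirement above typically means walking paginated REST endpoints. Below is a small sketch of that pattern with the HTTP call injected as a callable, so it works against any client; the response shape (`items`, `next_page`) is an assumption for illustration, not a specific API.

```python
from typing import Callable, Iterator

def fetch_all(fetch_page: Callable[[int], dict]) -> Iterator[dict]:
    """Yield every item from a paginated API.

    `fetch_page(page)` is any callable returning a dict shaped like
    {"items": [...], "next_page": int | None} -- a hypothetical shape;
    real APIs vary (cursor tokens, Link headers, offset/limit).
    """
    page = 0
    while page is not None:
        resp = fetch_page(page)
        yield from resp["items"]
        page = resp.get("next_page")  # None terminates the walk

# Usage with a stubbed two-page API:
pages = {
    0: {"items": [{"id": 1}, {"id": 2}], "next_page": 1},
    1: {"items": [{"id": 3}], "next_page": None},
}
items = list(fetch_all(lambda p: pages[p]))
print(len(items))  # 3
```

Injecting the fetch function also makes the pagination logic trivially unit-testable without network access, which fits the CI/CD expectation in the same list.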

🏖️ Benefits

• Competitive compensation and benefits
• Professional growth opportunities with exposure to the latest technologies


Similar Jobs

November 3

Data Engineer developing high-performance data pipelines using Oracle solutions at Avery Dennison. Collaborating with data teams to enhance operational efficiency and value for customers.

Azure, Cloud, ETL, Oracle, SQL

October 31

Senior Data Engineer designing and optimizing data pipelines in Snowflake and Azure for a wealth management firm. Responsible for data architecture, migration, and automation to drive analytics and insights.

Airflow, Apache, Azure, Cloud, ETL, Python, SQL

October 31

Cloud Data Architect shaping cloud data strategy with Azure and Snowflake at Smart Working. Leading data architecture practices to enhance performance, security, and compliance in wealth management.

Azure, Cloud, Terraform

October 28

Azure DataOps Lead responsible for operational delivery and automation of Azure-based data platforms for enterprise clients. Driving collaboration among Data Engineering, Cloud, and Business teams.

Azure, Cloud, Python, SQL, Vault
