Principal Engineer – Enterprise Data Platform, Data Warehouse, Data Modeling, GCP, Azure, AWS


October 24

Amazon Redshift

AWS

Azure

BigQuery

Cloud

ERP

ETL

Informatica

Oracle

Python

Scala

Spark

Tableau

Apply Now

Western Digital


Western Digital is a leading manufacturer of data storage solutions, including solid-state drives (SSDs), hard disk drives (HDDs), USB flash drives, and memory cards. The company provides a broad range of products for uses such as gaming, video surveillance, and data backup, and also offers solutions for data centers, network-attached storage (NAS), and digital photography. Western Digital is renowned for its innovation in developing storage technologies that meet the needs of both consumers and businesses.

10,000+ employees

Founded 1970

👥 B2C

🤝 B2B

🔧 Hardware

💰 $900M Post-IPO Equity on 2023-01

📋 Description

• Provide strategic direction and technical oversight for a significant greenfield initiative focused on rebuilding and modernizing our data warehouse infrastructure.
• Apply deep knowledge of data engineering principles, tools, and technologies.
• Provide direction and guidance on data architecture, solution design, and implementation strategies.
• Design and implement data pipelines, data warehouses, and data lakes using modern technology components such as Spark, Iceberg, Delta Lake, Scala, and Python (a minimal illustrative sketch follows this list).
• Implement and enforce data governance policies, including data security controls and privacy regulations.
• Define and maintain end-to-end data flows, data lineage, and a data catalog for various data marts.
• Act as a liaison between solution architects, BSAs, and data engineers to ensure compliance with data integration and data management standards, and review data solutions.
• Maintain an inventory and roadmap of data assets across the data platform.
• Ensure best practices for data democratization architecture and data virtualization.
• Offer insight, guidance, prototypes, and direction on the use of the latest trends and technical capabilities for enterprise data management.
• Stay current with the latest industry trends and best practices, sharing knowledge and encouraging the team to continuously improve their skills.
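To illustrate the kind of pipeline work described above, here is a minimal sketch of a batch ingestion step using PySpark with Delta Lake. It is not part of the posting: the source file path, table location, column names, and app name are hypothetical, and the sketch assumes the pyspark and delta-spark packages are installed.

```python
# Minimal sketch: land a raw ERP extract into a Delta Lake table with PySpark.
# Paths, schema, and table locations are hypothetical examples.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

builder = (
    SparkSession.builder.appName("erp-orders-ingest")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Read the raw CSV extract and stamp each row with an ingestion date.
orders = (
    spark.read.option("header", True).csv("/landing/erp/orders.csv")
    .withColumn("ingest_date", F.current_date())
)

# Append to a bronze-layer Delta table, partitioned by ingestion date.
(orders.write.format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .save("/lakehouse/bronze/erp_orders"))
```

A Scala or Iceberg variant of the same step would follow the same read-transform-write shape.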

🎯 Requirements

• Bachelor’s degree or higher in Computer Science, Engineering, or a related field.
• 15+ years of experience in data management, including data integration/ETL (Informatica, Talend, Fivetran, etc.), data warehouses/lakes (Oracle, Snowflake, Databricks, Google BigQuery, Redshift), master data management, data quality, data modeling, data analytics/BI, data enrichment, security, and governance.
• 7+ years of experience focused specifically on delivery or solution architecture for large, complex data management programs.
• Strong data modeling expertise supporting both classic and modern data lakehouse architectures.
• Strong understanding of and experience building ELT/ETL solutions.
• Experience with data quality tools such as Informatica Data Quality, Atlan, and Collibra (see the quality-check sketch after this list).
• Demonstrable experience working with data structures from a variety of ERP, CRM, and other data sources.
• Experience with at least one major cloud data platform, such as AWS, Azure, or Google Cloud.
• Experience with at least one modern lakehouse platform, such as Databricks or Snowflake.
• Experience with Tableau, Power BI, or other data visualization tools.
• Knowledge of advanced analytics and GenAI platforms is a plus.
• Develop entity relationship diagrams using data modeling tools.
• Guide the technical team to tune complex solutions, monitor system performance, and provide recommendations and means for improvement.
• Prototype new technologies and implement innovative solutions that enable teams to consume and understand data faster.
• Responsible for metadata management of the data domain, including data definitions, the data catalog, data lineage, and documentation of data flows for critical processes and SOX compliance.
• Partner with data governance analysts and business data stewards.
• Maintain an in-depth understanding of business functions, processes, and relationships as they relate to data.
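As a loose illustration of the data quality expectations listed above (again hypothetical and not part of the posting), the sketch below runs two basic checks, null business keys and duplicate business keys, against the example Delta table from the previous sketch. In practice, dedicated tools such as Informatica Data Quality, Atlan, or Collibra would typically own and schedule such rules.

```python
# Minimal sketch: basic data quality checks on the hypothetical bronze table above.
# Table path and the order_id column are illustrative assumptions.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Reuse the Delta-enabled session configuration from the previous sketch.
builder = (
    SparkSession.builder.appName("erp-orders-quality-checks")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

orders = spark.read.format("delta").load("/lakehouse/bronze/erp_orders")

# Rule 1: the business key must never be null.
null_keys = orders.filter(F.col("order_id").isNull()).count()

# Rule 2: the business key must be unique.
duplicate_keys = (
    orders.groupBy("order_id").count()
    .filter(F.col("count") > 1).count()
)

print(f"null order_id rows: {null_keys}; duplicated order_id values: {duplicate_keys}")
if null_keys or duplicate_keys:
    raise ValueError("Data quality checks failed for erp_orders")
```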

🏖️ Benefits

• Health insurance
• 401(k) matching
• Flexible work hours
• Paid time off
• Remote work options
