Data Architecture and Migration Lead

4 days ago

Revolgy

Cloud Services • Enterprise • Technology Consulting

Revolgy is a cloud services company specializing in on-demand cloud solutions. A leading Google Cloud Premier Partner and APN (AWS Partner Network) Advanced Consulting Partner, it assists businesses at every stage of their cloud journey with services such as cloud consulting, build and migration, cloud optimization, and cloud operations. Its clients span industries including e-commerce, retail, financial services, and gaming, and it draws on machine learning, AI, and data analytics to enhance its offerings. With a focus on security and efficiency, Revolgy serves over 2,500 customers across 47 countries, providing expert solutions tailored to diverse business needs.

51 - 200 employees

🏢 Enterprise

💰 Private Equity Round (June 2020)

📋 Description

Strategic Leadership & Migration Design
• Define the Path: Design the end-to-end migration strategy (e.g., lift-and-shift vs. re-platforming), selecting the right GCP services (BigQuery, Dataflow, Composer) to replace legacy Spark/Databricks logic.
• Project Ownership: Lead the technical roadmap, ensuring milestones are realistic and delivered on time.
• Risk Management: Identify technical risks early (e.g., feature parity gaps, cost spikes) and design mitigation plans.

Customer & Stakeholder Management
• Trusted Advisor: Act as the primary technical point of contact for business stakeholders and clients, translating complex technical challenges into clear business impacts.
• Requirement Translation: Convert high-level business goals into detailed technical specifications for the engineering team.
• Crisis Management: Manage expectations and navigate scope changes with senior management when challenges arise.

Technical Governance & Mentorship
• Architecture Review: Approve the data models and transformation logic (dbt) designed by the engineering team.
• Quality Assurance: Define the strategy for "data parity" testing to ensure the new GCP system matches the legacy source system 100% (a minimal sketch of such a check follows this list).
• Team Guidance: Mentor Senior and Mid-level Data Engineers, helping them solve blockers related to Terraform, CI/CD, or SQL optimization.
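For illustration only, here is one way such a data parity check might look: a minimal Python sketch, assuming the legacy snapshot has already been landed in BigQuery next to the migrated table and queried via the google-cloud-bigquery client. The project, dataset, table, and order_id column names are hypothetical and not taken from the posting.

```python
# Minimal data-parity sketch (illustrative; table and column names are hypothetical):
# compare row counts and a key-fingerprint checksum between a legacy snapshot
# and the migrated table, both assumed to be queryable from BigQuery.
from google.cloud import bigquery

LEGACY_TABLE = "my_project.legacy_snapshot.orders"   # hypothetical
MIGRATED_TABLE = "my_project.analytics.orders"       # hypothetical

PARITY_SQL = f"""
SELECT
  (SELECT COUNT(*) FROM `{LEGACY_TABLE}`)   AS legacy_rows,
  (SELECT COUNT(*) FROM `{MIGRATED_TABLE}`) AS migrated_rows,
  (SELECT BIT_XOR(FARM_FINGERPRINT(CAST(order_id AS STRING)))
     FROM `{LEGACY_TABLE}`)                 AS legacy_checksum,
  (SELECT BIT_XOR(FARM_FINGERPRINT(CAST(order_id AS STRING)))
     FROM `{MIGRATED_TABLE}`)               AS migrated_checksum
"""

def check_parity() -> bool:
    """Return True when row counts and key checksums match."""
    client = bigquery.Client()
    row = next(iter(client.query(PARITY_SQL).result()))
    rows_match = row.legacy_rows == row.migrated_rows
    checksums_match = row.legacy_checksum == row.migrated_checksum
    print(f"rows: {row.legacy_rows} vs {row.migrated_rows}; "
          f"checksums match: {checksums_match}")
    return rows_match and checksums_match

if __name__ == "__main__":
    check_parity()
```

In practice a parity strategy usually layers several checks (counts, key checksums, full row diffs, aggregate comparisons) rather than relying on a single query; the sketch shows only the first layer.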

🎯 Requirements

• Experience: 8+ years in Data Engineering or Architecture, with at least 3 years in a Lead/Architect role facing customers and internal stakeholders.
• Migration Pro: Proven experience leading large-scale data platform migrations (e.g., on-prem to cloud or cloud to cloud).

Technical Competence
• Target Expertise (GCP): Deep architectural knowledge of the GCP data stack (BigQuery is a must).
• Source Awareness (Spark/Databricks): Ability to read and understand legacy PySpark or Databricks architectures to guide the refactoring process (hands-on coding is a plus, but conceptual mastery is required).
• Modern Stack: Familiarity with modern transformation tools like dbt and infrastructure as code (Terraform).
• Advanced SQL (required): Comfort writing complex queries to validate data parity, audit data models, and debug performance bottlenecks in BigQuery (an illustrative example follows this list).
• Python Literacy: Ability to read and interpret legacy PySpark or Python ETL scripts to extract business logic.

Soft Skills
• Communication: Exceptional ability to present technical strategies to non-technical audiences.
• Negotiation: Ability to push back on unrealistic deadlines or requirements while maintaining strong relationships.
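As an illustrative companion to the "debug performance bottlenecks in BigQuery" requirement, the sketch below queries BigQuery's INFORMATION_SCHEMA.JOBS_BY_PROJECT view for the most slot-hungry queries of the past week. This is a generic technique, not part of the posting; the region qualifier is an assumption, and listing jobs requires the appropriate IAM permission on the project.

```python
# Illustrative only: surface the most slot-hungry BigQuery jobs from the
# last 7 days via INFORMATION_SCHEMA. The region ("region-eu") is an assumption.
from google.cloud import bigquery

HOTSPOT_SQL = """
SELECT
  user_email,
  job_id,
  total_slot_ms,
  total_bytes_processed,
  LEFT(query, 120) AS query_preview
FROM `region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND job_type = 'QUERY'
  AND state = 'DONE'
ORDER BY total_slot_ms DESC
LIMIT 10
"""

def print_hotspots() -> None:
    """Print the heaviest recent queries by slot consumption."""
    client = bigquery.Client()
    for row in client.query(HOTSPOT_SQL).result():
        print(row.user_email, row.job_id, row.total_slot_ms, row.query_preview)

if __name__ == "__main__":
    print_hotspots()
```

Sorting by total_slot_ms rather than bytes processed highlights queries that monopolize compute, which are often the first candidates for partitioning, clustering, or rewriting during a migration.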

🏖️ Benefits

• Fully Remote: A 100% remote position with the flexibility to manage your own schedule.
• Impactful Culture: Join a supportive team driven by values like teamwork and innovation, where your proactivity is always celebrated.
• Commitment to Growth: We invest in your professional development through our "Continuous Learning" mindset, supporting both hands-on experience and certifications (like GCP Professional Cloud Architect).
• Innovation in Action: AI and automation aren't just buzzwords here. They are essential tools we expect everyone to use to amplify their impact.
• Autonomy & Accountability: Enjoy the flexibility and trust to influence your work, with full accountability for delivering results.

Apply Now

Similar Jobs

October 10

P2P Labs & P2P Tech Services

11 - 50

₿ Crypto

💸 Finance

🤝 B2B

Data Engineer at P2P.org, developing and maintaining ETL pipelines for blockchain data, optimizing ClickHouse analytics, and collaborating with the team to deliver reliable data services.

🇪🇺 Europe – Remote

⏳ Contract/Temporary

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

March 29

Cloudacio

11 - 50

🤖 Artificial Intelligence

🤝 B2B

Join Cloudacio as a Data Engineer to develop data pipelines and big data solutions remotely.

🇪🇺 Europe – Remote

⏳ Contract/Temporary

🟡 Mid-level

🟠 Senior

🚰 Data Engineer
