Data Architect - Azure

Job not on LinkedIn

August 22

Apply Now
Allata

B2B • Consulting • Technology

Allata is a global consulting firm that helps businesses achieve digital excellence by crafting unique customer experiences, identifying revenue-generating opportunities, and improving operational efficiency. Its offerings include strategic services, technology and cloud solutions, data and insights, and artificial intelligence development. Allata's approach is experience-led, strategy-aligned, tech-enabled, and data-driven. The firm works across industries such as automotive, financial services, health and life sciences, high tech, and logistics and transportation, among others, to drive modernization, personalization, innovation, and efficiency. With a team of former IT leaders and consulting professionals, Allata guides clients through complex digital landscapes to deliver value-based outcomes and help organizations integrate digital strategies seamlessly into their overall business strategies.

201 - 500 employees

🤝 B2B

📋 Description

• Lead data architecture transformations and/or migrations to modern cloud solutions, ensuring the robustness and scalability of the solution.
• Design and implement virtual cloud solutions, including virtual networks and management of network routing and subnets.
• Design efficient, scalable ETL/ELT pipelines that streamline data processing and integration, including integrating data across disparate systems using APIs and data integration tools.
• Design scalable, high-performance data models and databases, including Medallion data warehouse architecture, data lakehouses, dimensional modeling (Star and/or Snowflake schemas), and other relevant design patterns.
• Manage one or more data projects, owning end-to-end execution and taking responsibility for the outcomes.
• Lead cross-functional teams to deliver on client goals, providing technical guidance and ensuring alignment with best practices.
• Engage with clients and stakeholders to understand business goals, provide technical expertise, and deliver tailored solutions.
• Ensure all designed solutions meet the highest standards of quality, performance, cost-effectiveness, and access policy management.
• Stay up to date with emerging industry trends and technologies, integrating them into solutions to provide added value to clients.


Apply Now

Similar Jobs

July 28

Data Engineer position for structuring and analyzing strategic data to drive data-driven decisions.

🗣️🇪🇸 Spanish Required

Airflow

Apache

AWS

Azure

Cassandra

Cloud

Google Cloud Platform

IoT

MongoDB

MySQL

NoSQL

Numpy

Pandas

Postgres

Python

Scala

Spark

SQL

Tableau

June 9

Join dLocal as a Senior Data Engineer to enhance their data platform and governance. Work remotely in a flexible, dynamic culture.

Airflow

Apache

Cloud

Google Cloud Platform

Python

Spark

SQL
