Data Architect - Databricks


July 15


InOrg Global

Enterprise • SaaS • B2B

InOrg Global is a company specializing in managed services and innovative business strategies. It provides AI/ML services, strategic operations, and workspace management to enhance productivity and efficiency. InOrg offers a Build-Operate-Transfer (BOT) model, guiding businesses through design, development, and operation before seamlessly transferring control to ensure sustained success. The company empowers digital disruptors and optimizes business processes through its expertise in global capability centers, helping clients expand their global reach and achieve operational excellence.

📋 Description

• Databricks Solution Architecture: Design and implement scalable, secure, and efficient Databricks solutions that meet client requirements.
• Data Engineering: Develop data pipelines, architect data lakes, and implement data warehousing solutions using Databricks (see the brief sketch after this list).
• Data Analytics: Collaborate with data scientists and analysts to develop and deploy machine learning models and analytics solutions on Databricks.
• Performance Optimization: Optimize Databricks cluster performance, ensuring efficient resource utilization and cost-effectiveness.
• Security and Governance: Implement Databricks security features, ensure data governance, and maintain compliance with industry regulations.
• Client Engagement: Work closely with clients to understand their business requirements, provide technical guidance, and deliver high-quality Databricks solutions.
• Thought Leadership: Stay up to date with the latest Databricks features, best practices, and industry trends, and share knowledge with the team.
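For orientation, the data engineering work described above generally centers on Spark and Delta Lake. Below is a minimal PySpark sketch of a batch pipeline step of that kind; the paths, column names, and deduplication key are illustrative assumptions rather than details from this posting, and running it outside a Databricks workspace would require a Spark environment with the Delta Lake package configured.

```python
# Minimal sketch of a Databricks-style batch pipeline step: read raw JSON,
# apply light cleanup, and write the result to a Delta Lake table.
# All paths, columns, and the dedup key are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

# On Databricks a `spark` session already exists; this line only matters
# when the sketch is run elsewhere.
spark = SparkSession.builder.appName("example-delta-pipeline").getOrCreate()

raw = spark.read.json("/mnt/raw/events/")             # hypothetical landing zone

cleaned = (
    raw.dropDuplicates(["event_id"])                   # assumed unique key
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("event_date").isNotNull())
)

(
    cleaned.write
    .format("delta")                                   # Delta Lake storage format
    .mode("append")
    .partitionBy("event_date")
    .save("/mnt/lake/events_clean/")                   # hypothetical lake path
)
```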

🎯 Requirements

• Databricks Experience: 5+ years of experience working with Databricks, including platform architecture, data engineering, and data analytics.
• Technical Skills: Proficiency in languages such as Python, Scala, or Java, and experience with Databricks APIs, Spark, and Delta Lake.
• Data Engineering: Strong background in data engineering, including data warehousing, ETL, and data governance.
• Leadership: Proven experience leading technical teams, mentoring junior engineers, and driving technical initiatives.
• Communication: Excellent communication and interpersonal skills, with the ability to work effectively with clients and internal stakeholders.
• Certifications (good to have): Databricks Certified Professional or a similar certification.
• Cloud Experience: Experience working with cloud platforms such as AWS, Azure, or GCP.
• Machine Learning: Knowledge of machine learning concepts and experience with popular ML libraries.


Similar Jobs

July 9

Data Engineer specializing in Fraud Detection and Financial Crime Analytics for financial services. Responsible for designing real-time data pipelines and optimizing fraud detection models.

AWS, Azure, Cloud, ETL, Google Cloud Platform, Kafka, Python, Spark, SQL

July 4

Join e.l.f. Beauty as a Senior Data Architect to lead data architecture initiatives in a high-growth team.

AWS, Azure, Cloud, Distributed Systems, ETL, Google Cloud Platform, Java, Kafka, Python, Scala, SQL

July 4

Join e.l.f. Beauty as a Data Engineer to design and maintain data pipelines and infrastructure.

🇮🇳 India – Remote • 💵 ₹3.5M - ₹4.5M / year • 💰 $225.2M Post-IPO Secondary on 2017-03 • ⏰ Full Time • 🟠 Senior • 🔴 Lead • 🚰 Data Engineer

AWS, Azure, Cloud, ETL, Google Cloud Platform, Informatica, Java, Python, SQL

June 19

Senior Data Engineer developing AWS data lakes and ETL pipelines. Collaborating with cross-functional teams on data architecture and AI initiatives.

Airflow, Amazon Redshift, AWS, Cloud, ETL, Python, Spark, SQL

June 18

Lead Data Engineer for a growing team at Forbes Advisor, focusing on data engineering best practices.

Airflow, BigQuery, ETL, Google Cloud Platform, Kafka, Python, Spark, SQL, Tableau
