Enterprise Data Architect – AWS, Databricks


November 21


Aptus Data Labs

Artificial Intelligence • SaaS • B2B

Aptus Data Labs is a data engineering and enterprise AI company that builds scalable AI platforms, generative intelligence solutions, and data modernization services for large organizations. The company delivers industry-focused AI and analytics products (including aptplan, aptGenAI, and other platforms) and services—covering advisory, cloud migration, MLOps/LLMOps, AI governance, and on-demand talent—to help pharmaceutical, banking, manufacturing, retail, and other enterprises accelerate decision-making, compliance, and operational efficiency. Aptus partners with cloud and AI providers, offers pre-built accelerators and IP, and focuses on B2B deployments and enterprise-scale SaaS solutions.

201 - 500 employees

Founded 2014

🤖 Artificial Intelligence

☁️ SaaS

🤝 B2B

📋 Description

• Lead the enterprise data architecture strategy

• Architect and implement data lakehouse solutions using Databricks on AWS

• Design end-to-end data pipelines, integration frameworks, and governance models

• Define data models, metadata management, and data quality frameworks

• Collaborate with data engineering, AI/ML, analytics, and business teams

• Evaluate and integrate emerging technologies in data mesh and automation frameworks

• Provide technical leadership and mentorship to data engineering and architecture teams

• Establish best practices for data security, lineage, compliance, and cloud cost optimization

• Partner with business stakeholders to define data modernization roadmaps and cloud migration strategies

🎯 Requirements

• 18+ years of experience

• Strong experience in Data Architecture, Data Engineering, or related domains

• Proven experience architecting enterprise-scale data platforms using AWS

• Hands-on expertise in Databricks (Delta Lake, Spark, Unity Catalog, MLflow)

• Strong experience with data modeling and ETL/ELT pipelines

• Deep understanding of data governance, master data management, and data cataloging tools

• Proficient in SQL, Python, PySpark, and API-based data integration

• Experience with the modern data stack (Snowflake, dbt, Airflow, Kafka, etc.) is a plus

• Strong understanding of AI/ML data readiness and metadata design

• Excellent communication and leadership skills to collaborate with technical and business teams

• Certifications in AWS or Databricks preferred

🏖️ Benefits

• Health insurance

• Paid time off

• Flexible work hours

• Professional development opportunities


Similar Jobs

November 20

Weekday (YC W21)

11 - 50

☁️ SaaS

🎯 Recruiter

Big Data Engineer for Weekday's client, focusing on real-time streaming systems and large-scale data processing. Builds high-performance, low-latency pipelines using Java and modern big data technologies.

🇮🇳 India – Remote

💵 ₹600k - ₹1.9M / year

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

November 18

Smart Working

51 - 200

🤝 B2B

☁️ SaaS

🎯 Recruiter

Senior Data Engineer building scalable data solutions using Azure and SQL for global teams at Smart Working. Engages with clients and delivers measurable improvements across projects.

🇮🇳 India – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

November 18

WIN Home Inspection

201 - 500

🏠 Real Estate

Senior Data Engineer at WIN Home Inspection, transforming the real estate industry through data insights. Collaborates with teams to design BI dashboards and analyze data trends.

🇮🇳 India – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

November 18

Netomi

51 - 200

🤖 Artificial Intelligence

🏢 Enterprise

☁️ SaaS

Senior Data Engineer applying data engineering techniques to create scalable data solutions for enterprise customer experience at Netomi.

🇮🇳 India – Remote

💰 $30M Series B on 2021-11

⏰ Full Time

🟠 Senior

🚰 Data Engineer
