Senior Director - Cloud Solutions Architect

May 8


Acxiom

Acxiom is a leading data service provider that specializes in leveraging data management, identity resolution, analytics, and martech services to help businesses enhance their marketing efforts. With a commitment to the ethical use of data, Acxiom offers solutions across various industries such as automotive, healthcare, telecommunications, financial services, and retail. Their expertise aids in optimizing marketing strategies to acquire, retain, and grow customer relationships. Acxiom partners with renowned platforms such as Adobe, Salesforce, and Snowflake to drive data-driven marketing and enable digital transformation for their clients.

Marketing services • Direct Marketing Agency • Digital agency • Information Technology • Marketing Technology

1001 - 5000 employees

Founded 1968

🤝 B2B

🛍️ eCommerce

📋 Description

• Architect and design scalable Databricks solutions for enterprise data processing, analytics, machine learning, and business intelligence workloads.

• Lead complex data modernization initiatives, including assessments, roadmap creation, and execution of legacy-to-cloud migrations (on-prem Hadoop, EDWs, etc.).

• Define end-to-end architecture for ETL/ELT pipelines, data lakes, lakehouses, and real-time streaming platforms using Delta Lake and Databricks-native tools.

• Partner with client and internal teams to define data architecture patterns, security models (Unity Catalog, row/column-level access), and data governance standards.

• Drive presales efforts by shaping solution strategies, crafting compelling client proposals, contributing to RFP responses, authoring statements of work (SOWs), and developing tailored demos that showcase technical capabilities and business value.

• Implement CI/CD practices for Databricks notebooks/jobs using DevOps principles and infrastructure-as-code (e.g., Terraform, GitHub Actions, Azure DevOps).

• Develop and publish reference architectures, reusable frameworks, and accelerators to enable consistent platform adoption across teams.

• Mentor engineering teams in best practices for data engineering, lakehouse design, performance tuning, and workload optimization in Databricks.

• Stay current with evolving Databricks platform capabilities and contribute to internal Centers of Excellence and capability building.

• Design and implement data governance, access control, secure data-sharing strategies, and data clean rooms using Unity Catalog and Delta Sharing to enable compliant, cross-platform collaboration across partners.
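To give a flavor of the CI/CD responsibility above, here is a minimal, illustrative sketch of a GitHub Actions workflow that syncs a repository's notebooks into a Databricks workspace on merge to main. The repository layout (`notebooks/`), target workspace path (`/Shared/etl`), and secret names are assumptions for illustration, not part of the posting:

```yaml
# Hypothetical workflow: deploy notebooks to Databricks on push to main.
# Assumes DATABRICKS_HOST and DATABRICKS_TOKEN are stored as repo secrets.
name: deploy-notebooks
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install Databricks CLI
        run: pip install databricks-cli
      - name: Import notebooks into the workspace
        env:
          DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
          DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
        run: |
          databricks workspace import_dir notebooks /Shared/etl --overwrite
```

In practice a pipeline like this would also run linting and unit tests before the import step, and Terraform or Databricks Asset Bundles could manage the surrounding jobs and clusters as code.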

🎯 Requirements

• Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field.

• 15+ years of data architecture experience, including at least 5 years on the Databricks platform.

• Strong expertise in Apache Spark, Delta Lake, Databricks SQL, Unity Catalog, and MLflow.

• Demonstrated experience in data migration projects (e.g., Teradata, Hadoop, or Oracle to Databricks).

• Proficiency in Python, SQL, and Spark (PySpark), plus hands-on experience with REST APIs and automation.

• Solid understanding of modern data architectures (data lakehouse, streaming-first pipelines, real-time analytics).

• Experience integrating Databricks with AWS/GCP/Azure, Snowflake, and enterprise tools (Informatica, dbt, Airflow, etc.).

• Experience with security and compliance controls on Databricks: encryption, auditing, and access controls.

• Familiarity with cost-optimization and performance-tuning best practices in Databricks environments.

Apply Now
