
Digital Marketing • SaaS • AI
Axelerant is a digital experience agency that blends AI with human creativity to deliver transformational outcomes for its clients. It offers services in digital experience platforms, digital engineering, experience design, digital marketing, intelligent automation, and quality engineering. Axelerant focuses on driving value at the intersection of people and digital experiences through people transformation, learning and development, and people care. Its projects include digital transformations and digital experience platforms for clients such as Doctors Without Borders and OHCHR.org. Its approach emphasizes seamless, scalable, value-driven solutions aligned with each client's vision.
51 - 200 employees
☁️ SaaS
15 hours ago

• Design and own the end-to-end architecture of large-scale data platforms built on platforms like Snowflake and AWS.
• Define data engineering standards, patterns, and best practices for ingestion, modeling, governance, and consumption.
• Lead modernization efforts including data lakehouse design, migration planning, and platform consolidation.
• Partner with customer stakeholders to translate business requirements into scalable technical solutions.
• Architect optimized Snowflake environments covering data modeling, workload isolation, clustering, RLS/CLS, data sharing, and performance management.
• Implement secure, cost-efficient, cloud-native pipelines using AWS services such as S3, Athena, Redshift, Glue, Lambda, Kinesis, IAM, Step Functions, and CloudWatch.
• Leverage Snowflake capabilities including Snowpark, Streams and Tasks, Materialized Views, Data Marketplace, and Time Travel.
• Oversee the development of high-volume batch and streaming pipelines using Spark, Kafka, Airflow, or similar technologies.
• Implement robust ETL and ELT frameworks ensuring data accuracy, lineage, observability, and governance.
• Analyze and optimize data workflows for performance, scalability, and cost efficiency.
• Serve as the primary technical partner for customer teams, communicating architectural decisions and trade-offs.
• Lead technical workshops, discovery sessions, and solution reviews with both technical and non-technical stakeholders.
• Ensure alignment between customer goals and Axelerant's delivery approach while influencing long-term data strategy.
• Guide and mentor a team of data engineers, fostering a collaborative, growth-oriented environment.
• Provide technical oversight across workstreams, ensuring quality, consistency, and engineering excellence.
• Facilitate knowledge sharing, code reviews, design sessions, and continuous improvement practices.
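The incremental-processing pattern behind Snowflake Streams and Tasks, mentioned in the responsibilities above, can be sketched in plain Python: a "stream" records an offset into an append-only source so each "task" run sees only rows added since the last run. All class and function names here are illustrative stand-ins; the real feature is defined in Snowflake SQL, not Python.

```python
# Minimal sketch of the Streams-and-Tasks idea (illustrative names only):
# a stream tracks an offset into an append-only source, so a task run
# processes only the rows appended since the previous run.

class Stream:
    """Tracks how far a consumer has read into an append-only table."""

    def __init__(self, source):
        self.source = source      # a list of rows stands in for a table
        self.offset = 0           # position of the last consumed row

    def pending(self):
        """Rows appended since the last consume, without advancing."""
        return self.source[self.offset:]

    def consume(self):
        """Return the new rows and advance the offset."""
        rows = self.source[self.offset:]
        self.offset = len(self.source)
        return rows


def merge_task(stream, target):
    """A 'task' that upserts only changed rows into a target keyed by id."""
    for row in stream.consume():
        target[row["id"]] = row
    return target


orders = [{"id": 1, "amount": 10}]
stream = Stream(orders)
target = {}
merge_task(stream, target)              # first run processes row 1
orders.append({"id": 2, "amount": 25})
merge_task(stream, target)              # second run sees only row 2
print(sorted(target))                   # [1, 2]
```

The design point is idempotent incremental consumption: because the offset only advances on `consume()`, re-running the task when nothing new has arrived is a no-op.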
• Establish data governance frameworks including cataloging, lineage, access controls, quality checks, and lifecycle management.
• Ensure compliance with enterprise security standards and regulatory policies across AWS and Snowflake workflows.
• Promote DataOps principles including automation, CI/CD for pipelines, testing strategies, and monitoring.
• Stay current with emerging tools, technologies, and patterns in data engineering and cloud architecture.
• Recommend and introduce innovative solutions to enhance customer outcomes and internal accelerators.
• Contribute to building reusable components, architectural templates, and best practice guides.
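The "quality checks" and DataOps testing strategies listed above are often expressed as declarative check suites run as a pipeline gate, the pattern that tools like Great Expectations and dbt tests formalize. A hand-rolled sketch, with made-up check names and sample rows:

```python
# Illustrative data quality checks run as a DataOps gate: if any check
# fails, the pipeline run would be failed before data is published.

def check_not_null(rows, column):
    return all(r.get(column) is not None for r in rows)

def check_unique(rows, column):
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def check_range(rows, column, lo, hi):
    return all(lo <= r[column] <= hi for r in rows)

def run_suite(rows, suite):
    """Run each named check and return a pass/fail report."""
    return {name: check(rows) for name, check in suite.items()}

rows = [
    {"order_id": 1, "amount": 40.0},
    {"order_id": 2, "amount": 15.5},
]
suite = {
    "order_id_not_null": lambda r: check_not_null(r, "order_id"),
    "order_id_unique":   lambda r: check_unique(r, "order_id"),
    "amount_in_range":   lambda r: check_range(r, "amount", 0, 10_000),
}
report = run_suite(rows, suite)
print(report)  # every check passes for this sample
```

Keeping checks declarative (a dict of named predicates) is what makes them easy to version-control and run in CI alongside the pipeline code.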
• Eight or more years of experience designing and delivering large-scale data solutions.
• Deep expertise with Snowflake architecture including modeling, security, orchestration, and optimization.
• Strong experience architecting data platforms on AWS with hands-on knowledge of core data services.
• Proficiency in Python, Scala, or Java with experience building scalable data pipelines.
• Strong understanding of distributed processing frameworks like Spark or Flink and streaming tools like Kafka or Kinesis.
• Solid grounding in data modeling, warehouse and lakehouse design patterns, and query optimization.
• Experience with orchestration tools such as Apache Airflow or Prefect.
• Comfortable with containerization and orchestration using Docker and Kubernetes.
• Strong understanding of data governance, privacy, and security best practices.
• Excellent communication skills to work effectively with customers, engineering teams, and leadership.
• Proven leadership experience mentoring teams and driving large technical initiatives.
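One classic warehouse design pattern behind the "data modeling" requirement above is the star schema: a fact table references dimension tables by surrogate key, and analytic queries join them back and aggregate. A toy sketch with invented tables and keys (in SQL this would be a JOIN plus GROUP BY):

```python
# Illustrative star-schema aggregation: fact rows carry surrogate keys
# into a dimension table; analytics group facts by a dimension attribute.

dim_customer = {101: {"name": "Acme", "region": "EMEA"}}  # dimension table
fact_sales = [                                            # fact table
    {"customer_key": 101, "amount": 250.0},
    {"customer_key": 101, "amount": 100.0},
]

def sales_by_region(facts, dim):
    """Sum fact amounts grouped by the customer's region."""
    totals = {}
    for row in facts:
        region = dim[row["customer_key"]]["region"]
        totals[region] = totals.get(region, 0.0) + row["amount"]
    return totals

print(sales_by_region(fact_sales, dim_customer))  # {'EMEA': 350.0}
```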
• Be part of an **AI-first, remote-first** digital agency that's shaping the future of customer experiences.
• Collaborate with global teams and leading platform partners to solve meaningful challenges.
• Enjoy a culture that supports autonomy, continuous learning, and work-life harmony.
4 days ago
DataOps Engineer managing data pipelines and analytics platforms at Nagarro, a Digital Product Engineering company. Ensuring data reliability, running quality checks, and collaborating with infrastructure teams in a dynamic work environment.
Ansible
AWS
Azure
Cloud
Docker
ETL
Google Cloud Platform
Grafana
Jenkins
Kubernetes
Prometheus
Python
ServiceNow
Shell Scripting
SQL
Terraform
5 days ago
Principal Data Engineer leading data engineering strategy at a health plan dedicated to transforming healthcare. Overseeing teams to optimize data solutions and insights using cutting-edge technologies.
AWS
November 21
Enterprise Data Architect at Aptus Data Labs leading data architecture strategy and digital transformation. Designing data platforms using AWS and Databricks for analytics and AI initiatives.
Airflow
AWS
Cloud
ETL
Kafka
PySpark
Python
Spark
SQL
Unity Catalog
November 11
Cloud Data Engineer developing scalable applications on GCP while collaborating with data engineering teams. Requires experience in SQL, PL/SQL, and cloud technologies for large datasets.
BigQuery
Cloud
Google Cloud Platform
NoSQL
SQL
November 11
Data Architect designing, developing, and analyzing big data solutions for actionable insights. Leading projects on statistical modeling and database management with a focus on diverse data sources.
Hadoop
Python
Spark
SQL
TensorFlow