
AI • Enterprise • SaaS
Accellor is a company offering AI-driven solutions across various industries, focusing on enhancing efficiency and engagement through advanced applications and data strategies. Their services include leveraging artificial intelligence for enterprise applications, product engineering, and cloud services to transform industries such as healthcare, manufacturing, financial services, real estate, retail, travel, and hospitality. Accellor partners with technology leaders like Salesforce and Microsoft Dynamics 365 to deliver personalized, intelligent business applications. Committed to responsible AI practices, Accellor helps organizations harness the potential of data and AI to drive strategic decisions, automate operations, and provide superior experiences.
201 - 500 employees
🏢 Enterprise
☁️ SaaS
November 26

• Design and implement scalable data pipelines and ETL/ELT processes within Microsoft Fabric using a code-first approach
• Develop and maintain notebooks, data pipelines, workspaces, and other Fabric item configurations
• Build and optimize data architectures using delta tables, lakehouse, and data warehouse patterns
• Implement data modeling solutions, including star schema, snowflake schema, and slowly changing dimensions (SCDs)
• Performance-tune Delta, Spark, and SQL workloads through partitioning, optimization, liquid clustering, and other advanced techniques
• Develop and deploy Fabric solutions using CI/CD practices via Azure DevOps
• Integrate and orchestrate data workflows using Fabric Data Agents and REST APIs
• Collaborate with the development team and stakeholders to translate business requirements into technical solutions
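The slowly-changing-dimension work mentioned above is typically done with Delta `MERGE` statements inside Fabric, but the core SCD Type 2 logic can be sketched in plain pandas. This is a minimal illustration with hypothetical table and column names (`customer_id`, `city`), not the role's actual implementation:

```python
from datetime import date
import pandas as pd

# Existing dimension table in SCD Type 2 layout: one current row per key.
dim = pd.DataFrame({
    "customer_id": [1, 2],
    "city": ["Austin", "Denver"],
    "valid_from": [date(2024, 1, 1)] * 2,
    "valid_to": [None, None],       # None = open-ended (current version)
    "is_current": [True, True],
})

# Incoming snapshot from the source system (customer 2 has moved).
incoming = pd.DataFrame({
    "customer_id": [1, 2],
    "city": ["Austin", "Seattle"],
})

def scd2_merge(dim: pd.DataFrame, incoming: pd.DataFrame,
               load_date: date) -> pd.DataFrame:
    """Expire changed rows and append new current versions (SCD Type 2)."""
    current = dim[dim["is_current"]]
    merged = current.merge(incoming, on="customer_id", suffixes=("", "_new"))
    changed_ids = merged.loc[merged["city"] != merged["city_new"], "customer_id"]

    # Close out the old versions of changed rows.
    expire_mask = dim["customer_id"].isin(changed_ids) & dim["is_current"]
    dim.loc[expire_mask, ["valid_to", "is_current"]] = [load_date, False]

    # Append the new current versions with an open-ended validity window.
    new_rows = incoming[incoming["customer_id"].isin(changed_ids)].assign(
        valid_from=load_date, valid_to=None, is_current=True
    )
    return pd.concat([dim, new_rows], ignore_index=True)

result = scd2_merge(dim, incoming, date(2024, 6, 1))
```

Customer 1 keeps its single current row; customer 2 ends up with an expired Denver row and a new current Seattle row, preserving full history.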
**Microsoft Fabric Expertise:**
• Hands-on experience with Fabric notebooks, pipelines, and workspace configuration
• Fabric Data Agent implementation and orchestration
• Fabric CLI and CI/CD deployment practices

**Programming & Development:**
• Python (advanced proficiency)
• PySpark for distributed data processing
• Pandas and Polars for data manipulation
• Experience with Python libraries for data engineering workflows
• REST API development and integration

**Data Platform & Storage:**
• Delta Lake and Iceberg table formats
• Delta table optimization techniques (partitioning, Z-ordering, liquid clustering)
• Spark performance tuning and optimization
• SQL query optimization and performance tuning

**Development Environment:**
• Visual Studio Code
• Azure DevOps for CI/CD and deployment pipelines
• Experience with both code-first and low-code development approaches

**Data Modeling:**
• Data warehouse dimensional modeling (star schema, snowflake schema)
• Slowly Changing Dimensions (SCD Types 1, 2, and 3)
• Modern lakehouse architecture patterns
• Metadata-driven approaches

**Preferred Qualifications:**
• 5+ years of data engineering experience
• Previous experience with large-scale data platforms and enterprise analytics solutions
• Strong understanding of data governance and security best practices
• Experience with Agile/Scrum methodologies
• Excellent problem-solving and communication skills
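The dimensional-modeling requirement above (star schema) boils down to fact tables carrying measures and surrogate keys, joined to dimension tables carrying descriptive attributes. A tiny pandas sketch with made-up table and column names (`fact_sales`, `dim_product`) illustrates the pattern:

```python
import pandas as pd

# Hypothetical star-schema tables: one dimension, one fact (names illustrative).
dim_product = pd.DataFrame({
    "product_key": [10, 20],          # surrogate key
    "category": ["Bikes", "Helmets"],  # descriptive attribute
})
fact_sales = pd.DataFrame({
    "product_key": [10, 10, 20],       # foreign key into the dimension
    "sales_amount": [100.0, 150.0, 40.0],
})

# Resolve the surrogate key, then aggregate a measure by a dimension attribute.
sales_by_category = (
    fact_sales.merge(dim_product, on="product_key", how="left")
    .groupby("category", as_index=False)["sales_amount"]
    .sum()
)
```

In Fabric the same shape would live as Delta tables in a lakehouse or warehouse, with the join expressed in Spark or T-SQL rather than pandas; the key design point (narrow facts, wide dimensions, joins only on surrogate keys) is identical.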
November 11
Data Architect designing and implementing cloud-based data architecture at CompassX. Supporting analytics and Customer 360 experiences with a focus on Snowflake and dbt.
November 1
201 - 500
SAP Data Migration Architect specializing in S/4HANA implementations. Focused on data migration strategies, advisory services, and cutover planning in SAP environments.
ERP
October 31
Sr. Manager Data Engineering leading data integration and warehouse projects remotely. Focused on developing and configuring data pipelines with a solid background in GCP and Informatica.
October 31
Sr. Data Architect designing data architecture solutions using Neo4j and AWS services. Collaborating with engineering teams to deliver scalable data systems and mentoring junior engineers.
October 31
Data Architect focused on creating data architecture solutions for business requirements. Responsible for mentoring junior engineers and ensuring data integrity in a remote work environment.