
SaaS • Artificial Intelligence • Enterprise
Particle41 is a company that specializes in providing expertise and solutions in technology development, data science, and DevOps. They offer CTO advisory services, acting as a partner to strengthen tech strategies and deliver robust software solutions tailored to their clients' needs. Particle41 emphasizes modernizing operations using cloud architecture, integrating software systems, and leveraging artificial intelligence to provide innovative digital products. They work closely with businesses to ensure on-time project delivery, scalability, and maintaining data security. Particle41 is particularly focused on helping businesses improve their competitive edge through strategic tech solutions and ongoing support.
October 28

• Design, develop, and maintain scalable ETL (Extract, Transform, Load) pipelines to process large volumes of data from diverse sources.
• Build and optimize data storage solutions, such as data lakes and data warehouses, to ensure efficient data retrieval and processing.
• Integrate structured and unstructured data from various internal and external systems to create a unified view for analysis.
• Ensure data accuracy, consistency, and completeness through rigorous validation, cleansing, and transformation processes.
• Maintain comprehensive documentation for data processes, tools, and systems while promoting best practices for efficient workflows.
• Collaborate with product managers and other stakeholders to gather requirements and translate them into technical solutions.
• Participate in requirement analysis sessions to understand business needs and user requirements.
• Provide technical insights and recommendations during the requirements-gathering process.
• Participate in Agile development processes, including sprint planning, daily stand-ups, and sprint reviews.
• Work closely with Agile teams to deliver software solutions on time and within scope.
• Adapt to changing priorities and requirements in a fast-paced Agile environment.
• Conduct thorough testing and debugging to ensure the reliability, security, and performance of applications.
• Write unit tests and validate the functionality of developed features and individual elements.
• Identify and resolve software defects, code smells, and performance bottlenecks.
• Stay updated with the latest technologies and trends in full-stack development.
• Propose innovative solutions to improve the performance, security, scalability, and maintainability of applications.
• Collaborate effectively with cross-functional teams, including testers and product managers.
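The pipeline responsibilities above follow the classic extract/transform/load shape. A minimal sketch in Python, assuming pandas and a CSV source; the function names and data paths here are illustrative, not taken from the posting:

```python
# Minimal ETL sketch: a CSV source, pandas for transformation, and a
# CSV destination standing in for a warehouse or lake table.
import pandas as pd


def extract(source):
    """Extract: read raw records from a CSV source (path or file-like)."""
    return pd.read_csv(source)


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Transform: validate and clean -- drop incomplete rows and
    normalize column names for consistency across sources."""
    df = df.dropna()
    df.columns = [c.strip().lower() for c in df.columns]
    return df


def load(df: pd.DataFrame, destination: str) -> None:
    """Load: write the cleaned data to the target store (a plain CSV
    here; in practice a warehouse or lakehouse table)."""
    df.to_csv(destination, index=False)
```

In a production setting each stage would typically be wrapped with logging, retry logic, and unit tests (e.g., pytest) as the requirements below call for.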
• Bachelor's degree in Computer Science, Engineering, or a related field.
• Proven experience as a Data Engineer, with over 3 years of experience.
• Proficiency in the Python programming language.
• Experience with database technologies, both SQL (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB).
• Strong understanding of programming libraries, frameworks, and technologies such as Flask, API frameworks, data warehousing/lakehouse principles, databases and ORMs, and data analysis with Databricks, pandas, Spark, PySpark, machine learning, OpenCV, and scikit-learn.
• Utilities & tools: logging, requests, subprocess, regex, pytest.
• ELK stack, Redis, distributed task queues.
• Strong understanding of data warehousing/lakehouse principles and concurrent/parallel processing concepts.
• Familiarity with at least one cloud data engineering stack (Azure, AWS, or GCP) and the ability to quickly learn and adapt to new ETL/ELT tools across various cloud providers.
• Familiarity with version control systems like Git and collaborative development workflows.
• Competence in working on Linux and creating shell scripts.
• Solid understanding of software engineering principles, design patterns, and best practices.
• Excellent problem-solving and analytical skills, with keen attention to detail.
• Effective communication skills, both written and verbal, and the ability to collaborate in a team environment.
• Adaptability and willingness to learn new technologies and tools as needed.
• Health insurance
• 401(k) matching
• Flexible work hours
• Professional development opportunities
October 28
Absence Management Consultant providing consultation and support on absence management initiatives for Centro. Collaborating with sales executives to enhance client services and compliance.
October 28
Responsible for managing Medicare and Medicaid cost reports for OhioHealth, ensuring proper reimbursement. Act as expert in healthcare reimbursement matters and provide analyses for financial reporting.
October 28
1001 - 5000 employees
Territory Management Consultant at EMC, overseeing agency partnerships and driving premium growth in assigned territories. Collaborating with internal partners to achieve financial and operational goals.
October 28
1001 - 5000 employees
Territory Management Consultant overseeing agency partnerships and premium growth strategies at EMC. Building relationships and delivering training while driving business development across the Twin City Metro area.
October 28
Senior Claims Consultant managing all aspects of catastrophic medical professional liability claims for clients. Responsibilities include claims management, reporting, and industry tracking.