
SaaS • Artificial Intelligence • Enterprise
Particle41 specializes in technology development, data science, and DevOps. The company offers CTO advisory services, acting as a partner to strengthen tech strategy and deliver robust software solutions tailored to clients' needs. Particle41 emphasizes modernizing operations through cloud architecture, integrating software systems, and leveraging artificial intelligence to deliver innovative digital products. It works closely with businesses to ensure on-time project delivery, scalability, and data security, with a particular focus on sharpening clients' competitive edge through strategic tech solutions and ongoing support.
51 - 200 employees
☁️ SaaS
🤖 Artificial Intelligence
🏢 Enterprise
3 days ago
AWS
Azure
Cloud
ETL
Flask
Google Cloud Platform
Linux
MongoDB
MySQL
NoSQL
Postgres
PySpark
Python
Scikit-Learn
Spark
SQL
• Design, develop, and maintain scalable ETL (Extract, Transform, Load) pipelines to process large volumes of data from diverse sources.
• Build and optimize data storage solutions, such as data lakes and data warehouses, to ensure efficient data retrieval and processing.
• Integrate structured and unstructured data from various internal and external systems to create a unified view for analysis.
• Ensure data accuracy, consistency, and completeness through rigorous validation, cleansing, and transformation processes.
• Maintain comprehensive documentation for data processes, tools, and systems while promoting best practices for efficient workflows.
• Collaborate with product managers and other stakeholders to gather requirements and translate them into technical solutions.
• Participate in Agile development processes, including sprint planning, daily stand-ups, and sprint reviews.
• Conduct thorough testing and debugging to ensure the reliability, security, and performance of applications.
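The first two responsibilities above can be sketched as a minimal extract-transform-load flow. This is an illustrative example only, not Particle41's actual pipeline; the field names (`user_id`, `email`, `amount`) and the in-memory "warehouse" are assumptions made for the sketch.

```python
import pandas as pd

def extract(records):
    """Extract: load raw records into a DataFrame."""
    return pd.DataFrame(records)

def transform(df):
    """Transform: validate, cleanse, and normalize the data."""
    df = df.dropna(subset=["user_id"])                 # completeness: drop rows missing a key
    df["email"] = df["email"].str.strip().str.lower()  # consistency: normalize text
    df["amount"] = df["amount"].astype(float)          # accuracy: enforce types
    return df

def load(df, store):
    """Load: append validated rows to the target store (a list stands in for a warehouse)."""
    store.extend(df.to_dict("records"))
    return len(df)

# Usage: run the pipeline on two sample rows; the second lacks a key and is dropped.
raw = [
    {"user_id": 1, "email": " Alice@Example.com ", "amount": "10.5"},
    {"user_id": None, "email": "bad@row.com", "amount": "0"},
]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 1 row survives validation
```

In a production setting each stage would read from and write to real systems (object storage, a warehouse such as Snowflake or BigQuery), but the validate-cleanse-enforce-types shape of `transform` is the part the bullets above describe.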
• Bachelor's degree in Computer Science, Engineering, or a related field.
• Proven experience as a Data Engineer, with a minimum of 3 years of experience.
• Proficiency in the Python programming language.
• Experience with database technologies, both SQL (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB).
• Strong understanding of relevant libraries, frameworks, and technologies: Flask and API frameworks; data warehousing/lakehouse principles; databases and ORMs; data analysis tools such as Databricks, pandas, Spark, and PySpark; machine learning libraries such as OpenCV and scikit-learn.
• Familiarity with at least one cloud data engineering stack (Azure, AWS, or GCP) and the ability to quickly learn and adapt to new ETL/ELT tools across cloud providers.
• Familiarity with version control systems like Git and collaborative development workflows.
• Competence working on Linux and writing shell scripts.
• Solid understanding of software engineering principles, design patterns, and best practices.
• Excellent problem-solving and analytical skills, with keen attention to detail.
• Effective written and verbal communication skills, and the ability to collaborate in a team environment.
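As an illustration of the SQL proficiency the requirements call for, here is a small self-contained example using Python's standard-library `sqlite3` module. The `orders` table and its columns are hypothetical, invented for this sketch.

```python
import sqlite3

# Hypothetical schema for illustration; table and column names are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("acme", 10.0), ("acme", 5.5), ("globex", 7.0)],
)

# Aggregate revenue per customer, highest total first.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total FROM orders "
    "GROUP BY customer ORDER BY total DESC"
).fetchall()
print(rows)  # [('acme', 15.5), ('globex', 7.0)]
conn.close()
```

The same `GROUP BY`/aggregate pattern carries over directly to MySQL and PostgreSQL, the SQL engines named in the requirements.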
• Health insurance
• Professional development opportunities
Senior Data Engineer designing and optimizing data workflows for Sezzle's analytics and operational needs. Collaborating with teams to enhance data quality and infrastructure.
3 days ago
Data Engineer Senior developing data ingestion and orchestration solutions at GIGA IT. Utilizing modern technologies for data processing and governance.
November 8
Data Engineer role at Influur focused on AI-driven influencer marketing technology. Building and optimizing data systems with strong programming and data modeling skills.
October 9
Data Engineer developing data integration solutions with IBM DataStage and SQL. Analyzing data and providing production support for business needs.
September 14
Snowflake Data Engineer designing scalable data solutions at consulting firm Allata. Build ETL/ELT pipelines, ensure data quality, and implement CI/CD for data workflows.