
Artificial Intelligence • Enterprise • SaaS
Netomi provides AI-powered customer experience solutions. Its platform, Agentic OS, is built for enterprise-scale customer service and integrates with existing business systems. Netomi's solutions use generative AI and large language models to automate over 80% of customer inquiries, improve customer satisfaction, and reduce support costs. Global brands across industries rely on Netomi for secure, proactive, and predictive customer care through omnichannel support, including email, chat, messaging, SMS, social media, search, and voice. The company complies with stringent security standards and data protection regulations.
51 - 200 employees
🤖 Artificial Intelligence
🏢 Enterprise
☁️ SaaS
💰 $30M Series B in November 2021
November 18
Airflow
Amazon Redshift
Apache
AWS
Azure
BigQuery
Cloud
Distributed Systems
Docker
ETL
Google Cloud Platform
Hadoop
Kafka
Kubernetes
Python
Spark
SQL

• Architect and implement scalable, secure, and reliable data pipelines using modern data platforms (e.g., Spark, Databricks, Airflow, Snowflake).
• Develop ETL/ELT processes to ingest data from structured and unstructured sources.
• Perform exploratory data analysis (EDA) to uncover trends, validate data integrity, and derive insights that inform data product development and business decisions.
• Collaborate closely with data scientists, analysts, and software engineers to design data models that support high-quality analytics and real-time insights.
• Lead data infrastructure projects, including management of data on cloud platforms (AWS/Azure), data lake/warehouse implementations, and data quality frameworks.
• Ensure data governance, security, and compliance best practices are followed.
• Monitor and optimize the performance of data systems, addressing issues proactively.
• Mentor junior data engineers and help establish best practices in data engineering standards, tooling, and development workflows.
• Stay current with emerging technologies and trends in data engineering and recommend improvements as needed.
• Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
• 8+ years of hands-on experience in data engineering or backend software development roles.
• Proficiency with Python, SQL, and at least one data pipeline orchestration tool (e.g., Apache Airflow, Luigi, Prefect).
• Strong experience with cloud-based data platforms (e.g., AWS Redshift, GCP BigQuery, Snowflake, Databricks).
• Deep understanding of data modeling, data warehousing, and distributed systems.
• Experience with big data technologies such as Apache Spark, Kafka, and Hadoop.
• Familiarity with DevOps practices (CI/CD, infrastructure as code, containerization with Docker/Kubernetes).
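To make the ETL/ELT proficiency described above concrete, here is a minimal, illustrative sketch in pure Python using an in-memory SQLite staging table. This is an assumption-laden toy example, not Netomi's stack; a production pipeline at the scale the posting describes would use the platforms named above (Spark, Airflow, Snowflake, etc.).

```python
import sqlite3

def run_etl(raw_rows):
    """Toy ETL sketch: extract raw records, transform (clean/validate),
    load into a SQLite table. Illustrative only; the table name and
    schema are hypothetical."""
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT NOT NULL)"
    )

    # Transform: normalize emails, drop records with missing values.
    cleaned = [
        (r["id"], r["email"].strip().lower())
        for r in raw_rows
        if r.get("email")
    ]

    # Load: idempotent upsert so reruns do not duplicate rows.
    conn.executemany(
        "INSERT INTO customers (id, email) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET email = excluded.email",
        cleaned,
    )
    conn.commit()
    return conn

raw = [
    {"id": 1, "email": "  Alice@Example.COM "},
    {"id": 2, "email": None},  # dropped in the transform step
    {"id": 3, "email": "bob@example.com"},
]
conn = run_etl(raw)
rows = conn.execute("SELECT id, email FROM customers ORDER BY id").fetchall()
print(rows)  # → [(1, 'alice@example.com'), (3, 'bob@example.com')]
```

The idempotent-load pattern (upsert keyed on a stable id) is what lets an orchestrator such as Airflow safely retry a failed task without corrupting the target table.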
• Equal opportunity employer committed to diversity in the workplace
November 14
11 - 50 employees
Data Engineer specializing in data mapping, transformations, and integrations at IT consulting firm. Working with Python, SQL, and data-driven solutions.
November 13
Lead Data Engineer responsible for designing and implementing scalable data platforms in India. Collaborating with cross-functional teams and mentoring junior engineers.
November 11
Cloud Data Engineer developing scalable applications on GCP while collaborating with data engineering teams. Requires experience in SQL, PL/SQL, and cloud technologies for large datasets.
November 11
Data Architect designing, developing, and analyzing big data solutions for actionable insights. Leading projects on statistical modeling and database management with a focus on diverse data sources.
November 10
Full Stack Data Engineer role focusing on designing and optimizing data solutions as part of Codvo's team. Collaborating with various stakeholders to deliver analytics and machine learning capabilities.