
Enterprise • SaaS • B2B
InOrg Global is a company specializing in managed services and innovative business strategies. They focus on providing AI/ML services, strategic operations, and workspace management to enhance productivity and efficiency. InOrg offers a unique Build-Operate-Transfer (BOT) model, guiding businesses through design, development, operation, and seamless transition of control to ensure sustained success. The company is dedicated to empowering digital disruptors and optimizing business processes with expertise in global capability centers, helping clients expand their global reach and achieve operational excellence.
51 - 200 employees
🏢 Enterprise
☁️ SaaS
🤝 B2B
August 11
Responsibilities:
• Design, build, and maintain scalable data pipelines using Databricks and Apache Spark.
• Integrate data from various sources into data lakes or data warehouses.
• Implement and manage Delta Lake architecture for reliable, versioned data storage.
• Ensure data quality, performance, and reliability through testing and monitoring.
• Collaborate with data analysts, scientists, and stakeholders to meet data needs.
• Automate workflows and manage job scheduling within Databricks.
• Maintain clear and thorough documentation of data workflows and architecture.
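The data-quality duty above amounts to gating rows before they land in a lake or warehouse. A minimal sketch of such a gate is below, in plain Python so it runs without a Spark cluster; the function name, field names, and rules are illustrative, not part of the role description.

```python
def validate_rows(rows, required_fields, non_null_fields):
    """Partition rows into valid and rejected based on simple quality rules.

    A row is rejected if it lacks any required field or has a null in a
    field that must be populated; rejected rows carry the reasons so they
    can be logged or routed to a quarantine table.
    """
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if f not in row]
        nulls = [f for f in non_null_fields if row.get(f) is None]
        if missing or nulls:
            rejected.append({"row": row, "missing": missing, "nulls": nulls})
        else:
            valid.append(row)
    return valid, rejected


# Example: two rows fail, one for a null amount and one for a missing id.
good, bad = validate_rows(
    [{"id": 1, "amount": 10}, {"id": 2, "amount": None}, {"amount": 5}],
    required_fields=["id", "amount"],
    non_null_fields=["amount"],
)
```

In a Databricks job the same rule set would typically run as a Spark filter or a Delta Lake constraint; the pure-Python form is just the smallest runnable statement of the idea.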
Requirements:
• 3+ years in data engineering with strong exposure to Databricks and big data tools.
• Technical Skills: Proficient in Python or Scala for ETL development.
• Strong understanding of Spark, Delta Lake, and Databricks SQL.
• Familiar with REST APIs, including Databricks REST API usage.
• Cloud Platform: Experience with AWS, Azure, or GCP.
• Data Modeling: Familiarity with data lakehouse concepts and dimensional modeling.
• Version Control & CI/CD: Comfortable using Git and pipeline automation tools.
• Soft Skills: Strong problem-solving abilities, attention to detail, and teamwork.

Nice to Have:
• Certifications: Databricks Certified Data Engineer Associate/Professional.
• Workflow Tools: Experience with Airflow or Databricks Workflows.
• Monitoring: Familiarity with Datadog, Prometheus, or similar tools.
• ML Pipelines: Exposure to MLflow or model integration in pipelines.
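The Databricks REST API requirement above usually means triggering or inspecting jobs programmatically. The sketch below builds a request for the public Jobs API 2.1 run-now endpoint without sending it; the host, job id, and parameter names in the example are placeholders, and any real call would need an HTTP client plus a workspace token.

```python
import json


def build_run_now_request(host, job_id, token, notebook_params=None):
    """Build (url, headers, body) for the Databricks Jobs 2.1 run-now endpoint.

    Returns the pieces a client such as `requests.post` would need:
    the full URL, a bearer-token header set, and a JSON body.
    """
    url = f"{host}/api/2.1/jobs/run-now"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = {"job_id": job_id}
    if notebook_params:
        # Optional key/value parameters passed to a notebook task.
        body["notebook_params"] = notebook_params
    return url, headers, json.dumps(body)


# Example with placeholder values; nothing is sent over the network.
url, headers, body = build_run_now_request(
    "https://example.cloud.databricks.com", 123, "demo-token",
    notebook_params={"run_date": "2024-01-01"},
)
```

Separating request construction from transport keeps the logic unit-testable, which matches the CI/CD expectation listed above.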