
Enterprise • SaaS • B2B
InOrg Global is a company specializing in managed services and innovative business strategies. They focus on providing AI/ML services, strategic operations, and workspace management to enhance productivity and efficiency. InOrg offers a unique Build-Operate-Transfer (BOT) model, guiding businesses through design, development, operation, and seamless transition of control to ensure sustained success. The company is dedicated to empowering digital disruptors and optimizing business processes with expertise in global capability centers, helping clients expand their global reach and achieve operational excellence.
51 - 200 employees
🏢 Enterprise
☁️ SaaS
🤝 B2B
September 24

• Design, develop, and maintain large-scale data systems
• Develop and implement ETL processes using various tools and technologies
• Collaborate with cross-functional teams to design and implement data models
• Work with big data tools like Hadoop, Spark, PySpark, and Kafka
• Develop scalable and efficient data pipelines
• Troubleshoot data-related issues and optimize data systems
• Transition and upskill into Databricks & AI/ML projects
• Relevant experience in data engineering
• Strong proficiency in Python, SQL, ETL, and data modeling
• Experience with one or more of the following: Teradata, Informatica, Hadoop, Spark, PySpark, ADF, Snowflake, Big Data, Scala, Kafka
• Cloud knowledge (AWS, Azure, or GCP) is a plus
• Willingness to learn and adapt to new technologies, specifically Databricks & AI/ML
• Nice to have: Experience with Databricks
• Nice to have: Knowledge of AI/ML concepts and tools
• Nice to have: Certification in relevant technologies
• Competitive salary and benefits
• Opportunity to work on cutting-edge projects
• Collaborative and dynamic work environment
• Professional growth and development opportunities
• Remote work opportunities & flexible hours
September 19
Senior Data Engineer building and scaling analytics and performance marketing pipelines for Forbes Advisor. Designing pipelines and data models across GCP and AWS, collaborating globally.
September 4
Senior Data Architect designing and implementing AWS cloud-native data platforms for Alight. Enabling analytics, AI, governance, and migration of big data workloads.
August 28
Data Engineer at Saaf Finance builds AI-driven data infrastructure for mortgage origination; remote role with fast-paced startup culture.
August 27
Senior Data Engineer building and operating scalable ELT pipelines for Liven, a hospitality technology provider. Driving data reliability and analytics across global venues.
August 25
Big Data Engineer designing and optimizing terabyte-scale pipelines for BrightEdge's SEO platform.