Data Architect

Job not on LinkedIn

September 24

InOrg Global

Enterprise • SaaS • B2B

InOrg Global specializes in managed services and innovative business strategies, providing AI/ML services, strategic operations, and workspace management to enhance productivity and efficiency. InOrg offers a Build-Operate-Transfer (BOT) model, guiding businesses through design, development, and operation, then seamlessly transferring control to ensure sustained success. The company is dedicated to empowering digital disruptors and optimizing business processes, and its expertise in global capability centers helps clients expand their global reach and achieve operational excellence.

📋 Description

• Design, develop, and maintain large-scale data systems
• Develop and implement ETL processes using various tools and technologies
• Collaborate with cross-functional teams to design and implement data models
• Work with big data tools like Hadoop, Spark, PySpark, and Kafka
• Develop scalable and efficient data pipelines
• Troubleshoot data-related issues and optimize data systems
• Transition and upskill into Databricks & AI/ML projects

🎯 Requirements

• Relevant experience in data engineering
• Strong proficiency in Python, SQL, ETL, and data modeling
• Experience with one or more of the following: Teradata, Informatica, Hadoop, Spark, PySpark, ADF, Snowflake, Big Data, Scala, Kafka
• Cloud knowledge (AWS, Azure, or GCP) is a plus
• Willingness to learn and adapt to new technologies, specifically Databricks & AI/ML
• Nice to have: experience with Databricks
• Nice to have: knowledge of AI/ML concepts and tools
• Nice to have: certification in relevant technologies

🏖️ Benefits

• Competitive salary and benefits
• Opportunity to work on cutting-edge projects
• Collaborative and dynamic work environment
• Professional growth and development opportunities
• Remote work opportunities & flexible hours

Similar Jobs

September 19

Senior Data Engineer building and scaling analytics and performance marketing pipelines for Forbes Advisor. Designing pipelines and data models across GCP and AWS, collaborating globally.

Airflow • AWS • BigQuery • Google Cloud Platform • Kafka • Python • Spark • SQL • Tableau

September 4

Senior Data Architect designing and implementing AWS cloud-native data platforms for Alight. Enabling analytics, AI, governance, and migration of big data workloads.

Amazon Redshift • AWS • Cloud • ETL • Hadoop • Java • Kafka • Oracle • Python • Scala • Spark • SQL • Terraform

August 28

Data Engineer at Saaf Finance builds AI-driven data infrastructure for mortgage origination; remote role with fast-paced startup culture.

AWS • BigQuery • Cloud • ETL • Python • SQL • Vault

August 27

Senior Data Engineer building and operating scalable ELT pipelines for Liven, a hospitality technology provider. Driving data reliability and analytics across global venues.

Airflow • Apache • Cloud • Kafka • Python • SQL • Terraform

August 25

Big Data Engineer designing and optimizing terabyte-scale pipelines for BrightEdge's SEO platform.

Airflow • Distributed Systems • Docker • ETL • Hadoop • Kubernetes • Microservices • NoSQL • Python • Spark • SQL
