Data Engineer – Human Capital

Job not on LinkedIn

October 24

Apply Now

GaP Solutions

SaaS • Retail • Hardware

GaP Solutions is an Australian provider of integrated retail technology and equipment. The company develops and delivers cloud-native EM Cloud™ back-office and EM POS™ point-of-sale software alongside POS hardware (printers, servers, terminals), weighing and labelling systems, and commercial food and bakery equipment. GaP Solutions specialises in retail-focused solutions for grocery, bakery, butchery, liquor, fuel/forecourt and convenience retailers, offering machine-learning-enhanced inventory and date-check management, self-checkout, integration/APIs, installation, certification and 24/7 support. The company sells and supports these products directly to retail businesses (B2B) and emphasises in-house development and service.

51 - 200 employees

Founded 1992

☁️ SaaS

🛒 Retail

🔧 Hardware

📋 Description

• Serve as an advisor to our NASA OCHCO (Office of the Chief Human Capital Officer) customer while leading development in data engineering activities.
• Develop, implement, and maintain a data pipeline using the Databricks platform, including extracting datasets using APIs, manipulating common data formats (e.g., parquet, text files, JSON), and automating routine data refreshes.
• Build AWS resources to support data pipelining, storage, and analysis efforts.
• Research and recommend software, systems, and processes as needed.
• Be an effective consultant, including gathering analytic requirements to provide clear and concise solutions.
• Work in a fast-paced, solutions-oriented environment focused on client deliverables, analysis, and reporting.
• Conduct extensive quality control and record-keeping procedures to ensure the highest levels of data integrity.
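The pipeline responsibilities above translate roughly into the sketch below: pull JSON records from a REST API, flatten them into a DataFrame, and land the result as parquet for a scheduled refresh. The endpoint, output path, and column handling are illustrative assumptions rather than details from the posting; on Databricks the output would typically be a DBFS or S3 path and the script would run as a scheduled job or notebook.

```python
# Minimal extract-transform-load sketch for an API-fed parquet refresh.
# All names here (API_URL, OUTPUT_PATH) are hypothetical placeholders.
import requests
import pandas as pd

API_URL = "https://example.com/api/v1/workforce-records"  # hypothetical source API
OUTPUT_PATH = "/tmp/workforce_records.parquet"            # on Databricks, typically a DBFS/S3 path


def extract(url: str) -> list:
    """Pull raw JSON records from the source API."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()


def transform(records: list) -> pd.DataFrame:
    """Flatten nested JSON into a tabular frame and drop fully empty rows."""
    df = pd.json_normalize(records)
    return df.dropna(how="all")


def load(df: pd.DataFrame, path: str) -> None:
    """Persist the cleaned data as parquet for downstream analysis."""
    df.to_parquet(path, index=False)


if __name__ == "__main__":
    load(transform(extract(API_URL)), OUTPUT_PATH)
```

Automating the routine refresh would then be a matter of scheduling this script (for example, as a Databricks job) rather than changing the code itself.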

🎯 Requirements

• Bachelor’s degree (Master’s preferred) in a STEM or social science-related field, including coursework in basic statistics and measurement.
• 3+ years of related data engineering or technical consulting experience.
• Experience developing and automating data pipelines using Databricks.
• Proficiency in Python for object-oriented programming, automation, and data analysis; experience with computational tools such as R is also expected.
• Experience with ETL processes, including ingesting data from raw sources (APIs, web scraping), transforming it into usable formats for analysis, and validating it with automated checks.
• Hands-on experience with GitHub for version control.
• Familiarity with GitHub Projects for task management.
• Experience building common AWS infrastructure (e.g., EC2, S3, Lambda).
• Working knowledge of databases and SQL; experience linking analytic and data-visualization products to database connections is preferred.
• Excellent communication skills and a proven ability to work with business customers and technical teams.
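The ETL-validation and AWS items above could look something like the following sketch: automated integrity checks on an extracted DataFrame, followed by a boto3 upload of the validated parquet file to S3. The expected columns, bucket name, and specific checks are invented for illustration and are not taken from the posting.

```python
# Illustrative validation-and-publish step; schema and bucket are hypothetical.
import boto3
import pandas as pd

EXPECTED_COLUMNS = {"employee_id", "org_code", "record_date"}  # hypothetical schema
BUCKET = "example-human-capital-data"                          # hypothetical S3 bucket


def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Run basic automated checks before the data is published for analysis."""
    missing = EXPECTED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"missing expected columns: {sorted(missing)}")
    if df.empty:
        raise ValueError("no rows extracted")
    if df["employee_id"].duplicated().any():
        raise ValueError("duplicate employee_id values found")
    return df


def publish(df: pd.DataFrame, key: str) -> None:
    """Write the validated data to S3 as parquet."""
    local_path = "/tmp/validated.parquet"
    df.to_parquet(local_path, index=False)
    boto3.client("s3").upload_file(local_path, BUCKET, key)
```

Checks like these are usually wrapped into the pipeline itself so that a failed refresh stops before bad data reaches reporting.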

🏖️ Benefits

• None


Similar Jobs

October 24

CMIT

201 - 500

🚗 Transport

🤝 B2B

Data Engineer specializing in migrating Oracle databases to PostgreSQL for modernization initiatives. Collaborating on enterprise-scale data systems within AWS environments.

🇺🇸 United States – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

October 24

Medefy Health

51 - 200

☁️ SaaS

👥 B2C

🛍️ eCommerce

Data Engineering Team Lead overseeing data capabilities at Medefy Health. Leading a team to build and optimize data infrastructure for employee benefits navigation.

🇺🇸 United States – Remote

💵 $155k - $185k / year

⏰ Full Time

🟠 Senior

🚰 Data Engineer

October 24

Greater Nevada Mortgage

11 - 50

🏦 Banking

💸 Finance

Data Engineer managing all data-related initiatives at Greater Nevada. Responsible for data architecture, pipelines, and ensuring data integrity in the organization.

🇺🇸 United States – Remote

💵 $105k - $150k / year

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

October 23

YepCode

11 - 50

☁️ SaaS

🤖 Artificial Intelligence

🔌 API

Data Engineer responsible for developing and maintaining data pipelines for API integrations. Collaborating with teams to ensure data quality and integration with cloud platforms.

🇺🇸 United States – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

October 23

MetaMask

51 - 200

₿ Crypto

🌐 Web 3

💳 Fintech

Senior Data Engineer at Consensys designing and building robust data pipelines for blockchain applications. Collaborating with teams to ensure data is reliable, accessible, and ready for analysis.

🇺🇸 United States – Remote

💵 $156k - $187k / year

⏰ Full Time

🟠 Senior

🚰 Data Engineer
