Data Engineer

Job not on LinkedIn

November 23

Apply Now

Abyss Solutions Ltd

Abyss is pioneering the future of inspection at scale, providing products and solutions that are enabling autonomous robots to capture and analyze data at an unprecedented level. Its industry-leading technology is pushing the boundaries of the possible, going beyond the status quo to deliver billions of dollars in risk reduction for some of the world’s biggest companies.

51 - 200 employees

📋 Description

• Develop, manage, and optimise operational workflows to ensure efficient and streamlined processes across projects.
• Use advanced Python and Bash scripting skills to build, troubleshoot, and enhance workflows.
• Maintain high standards of quality in deliverables, implementing best practices to reduce errors and ensure compliance with internal SOPs.
• Act as a primary point of contact for cross-functional teams, providing clear communication and updates on project progress, requirements, and potential risks.
• Ensure that all projects are delivered on time, managing resources and setting realistic timelines to align with business objectives.
• Continuously evaluate workflows, identify areas for improvement, and implement changes to enhance productivity and reduce operational bottlenecks.
• Maintain thorough documentation for processes, workflows, and scripts, ensuring knowledge sharing and easy hand-off to team members as needed.

🎯 Requirements

• 6+ years of experience in an operations role, with a track record of successfully managing workflows and processes.
• Proficiency in Python for scripting and automation tasks.
• Strong knowledge of Bash scripting.
• Familiarity with pipeline workflow systems.
• GCP/AWS knowledge (preferred but not mandatory).
• Demonstrated experience managing multiple projects simultaneously, ensuring quality and timely delivery.
• Excellent verbal and written communication skills to effectively liaise with wider teams and stakeholders.
• Strong analytical skills with a proactive approach to identifying and resolving operational issues.

🏖️ Benefits

• Note: This position is only open to candidates based in Pakistan.

Similar Jobs

November 20

Data Engineer (Adobe AEP) building and maintaining ingestion pipelines for diverse data sources in a remote setting. Seeking innovative individuals to join a forward-thinking team at TechBiz Global.

AWS, Azure, Cloud, ETL, Google Cloud Platform, Kafka, Python, Scala, SQL

November 20

Data Architect (Adobe AEP) providing data architecture expertise for top clients. Designing ingestion pipelines and ensuring data governance in an innovative environment.

AWS, Azure, Cloud, Google Cloud Platform

November 18

Data Architect designing and governing the data ecosystem for a ValueX platform at TechBiz Global. Ensuring scalability and collaboration with various teams for data architecture alignment.

Airflow, Apache, AWS, Azure, BigQuery, Cloud, ETL, Google Cloud Platform, GraphQL, Kafka, PySpark, Python, Spark, SQL

October 14

Data Engineer responsible for designing, building, and maintaining reliable data pipelines and infrastructure. Collaborating with cross-functional teams to ensure data quality and accessibility for analytics.

Airflow, Amazon Redshift, BigQuery, ETL, Kafka, Python, Scala, SQL

September 29

Data Architect designing scalable data models, integration, and governance at Creative Chaos. Leading database design and performance optimization while mentoring junior engineers.

Apache, Cloud, Google Cloud Platform, MySQL, Oracle, SQL
