Senior Data Engineer

June 10


GBH

B2B • SaaS • Cybersecurity

GBH is a trusted provider of board-level virtual Chief Technology Officer (vCTO) and virtual Chief Information Officer (vCIO) services, helping businesses create and execute impactful growth strategies. Their team includes talented UX researchers and designers who specialize in transforming ideas into world-class digital products. GBH also offers engineering resources and IT managed services, providing essential IT support, cybersecurity, and business continuity solutions. With over 18 years of experience and a commitment to sustainable business growth, GBH partners with companies to navigate complex challenges and achieve strategic goals effectively.

51-200 employees

Founded 2005


📋 Description

• This is a remote position.
• At GBH, we don’t just do tech—we live it, breathe it, and build it with purpose.
• We’re the dreamers, the builders, the strategists who turn ideas into digital experiences that actually matter.
• Whether it’s crafting seamless mobile and web apps, unlocking insights through big data, or rethinking tech strategies, we do it all with impact in mind and belonging at heart.
• We’re Geared for Impact. Built for Belonging. And always ready for what’s next.
• GBH is seeking an experienced Senior Data Engineer to design, develop, and maintain data pipelines and infrastructure.
• The ideal candidate will work with modern data engineering tools and practices, including Apache Airflow, Elasticsearch, PostgreSQL, and Oracle, ensuring high data integrity, availability, and performance.
• This role involves processing and transforming data from multiple sources into structured, scalable pipelines for downstream analytics and reporting tools such as Apache Superset.
• The Senior Data Engineer will be a key contributor to the design and implementation of robust ETL/ELT processes and will work closely with software developers, analysts, and DevOps engineers in an agile development environment.
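To give a concrete sense of the ETL/ELT work the role centers on, here is a minimal, dependency-free Python sketch of the extract-transform-load pattern. All table and field names are hypothetical illustrations, not GBH's actual schema; in production these steps would read from PostgreSQL/Oracle and be orchestrated as Apache Airflow DAG tasks rather than called inline.

```python
from datetime import date

def extract():
    # Stand-in for a database read (e.g. a psycopg2 or oracledb cursor fetch).
    # Rows arrive as raw strings, as they often do from source systems.
    return [
        {"order_id": 1, "amount": "19.99", "ordered_on": "2024-06-01"},
        {"order_id": 2, "amount": "5.00", "ordered_on": "2024-06-02"},
    ]

def transform(rows):
    # Normalize types so downstream reporting tools (e.g. Superset)
    # see clean numeric and date columns instead of raw strings.
    return [
        {
            "order_id": r["order_id"],
            "amount": float(r["amount"]),
            "ordered_on": date.fromisoformat(r["ordered_on"]),
        }
        for r in rows
    ]

def load(rows, sink):
    # Stand-in for a bulk insert into the analytics store;
    # returns the number of rows loaded.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In an Airflow deployment, each of these functions would typically become its own task so that failures can be retried per step and data quality checks can run between stages.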

🎯 Requirements

• Bachelor’s degree in Computer Science, Data Engineering, or a related technical field.
• 5 years of experience in data engineering, working on large-scale ETL/ELT systems.
• 3 years of professional experience with Apache Airflow, building and maintaining DAGs.
• 3 years of experience with Elasticsearch, including index design and optimization.
• Strong experience with Python for data pipeline development and scripting.
• Experience with PostgreSQL and Oracle databases, including SQL optimization and data modeling.
• Experience working in UNIX/Linux environments.
• Solid understanding of CI/CD pipelines using GitLab.
• Familiarity with DevOps practices, containerization (e.g. Docker), and optionally Kubernetes.
• Experience working in Agile software development teams.
• Proficiency with Jira for project and issue tracking.
• Excellent communication skills in English (verbal and written).

🏖️ Benefits

• Our Culture: A friendly, fast-paced, and inclusive environment.
• We rely on an open and empathetic culture that constantly promotes the growth of our team.
• Learning & Development: We do our best to set the best baselines to accelerate your career.
• Benefits & Rewards: We strive to offer competitive, unbiased, and fair rewards for all our people.
• We empower you to manage your own time and promote flexible working opportunities, along with family-friendly policies.


Similar Jobs

June 9

TetraScience seeks a Scientific Data Architect to design solutions for scientific data workflows. Join a leader in the scientific data and AI cloud industry.

AWS • Azure • Cloud • Distributed Systems • Google Cloud Platform • Python • Tableau

June 3

Be involved in the full project lifecycle as a Data Engineer, managing a soil carbon dataset to build robust systems.

Amazon Redshift • AWS • MySQL • Python • SQL • Tableau

May 31

Develop and deploy data solutions to improve efficiency in DoD and Navy IT projects.

AWS • Cloud • Cybersecurity • ETL • PySpark • Python • ServiceNow • SQL • Terraform

May 30

Senior Data Engineer role at Elder Research, designing data pipelines that support data science models.

AWS • Cloud • JavaScript • Python

May 28

Join Koger, LLC as a Remote Data Engineer to develop scalable data infrastructure for healthcare and media.

Airflow • AWS • Azure • Cloud • Google Cloud Platform • Java • Kafka • Node.js • Python • Spark

Built by Lior Neu-ner. I'd love to hear your feedback — Get in touch via DM or support@remoterocketship.com