Senior Data Engineer

September 28

Apply Now
3Pillar Global

SaaS • Enterprise • Artificial Intelligence

3Pillar Global is a modern application strategy, design, and engineering firm that specializes in delivering strategic software development initiatives across a range of industries. Its services include application technology strategy, digital product engineering, data and analytics, and artificial intelligence development. 3Pillar Global helps organizations turn bold ideas into breakthrough solutions by leveraging cutting-edge technologies such as generative and multimodal AI. It works with partners and clients in sectors including healthcare, financial services, insurance, media, and information services to solve complex technology challenges and deliver high-performing solutions.

1,001 - 5,000 employees

☁️ SaaS

🏢 Enterprise

🤖 Artificial Intelligence

💰 Private Equity Round in October 2021

📋 Description

• Build shippable software following engineering standards
• Build and maintain key engineering blocks such as APIs and Big Data implementations
• Support and extend the current stack with new features
• Work on ad-hoc R&D projects
• Collaborate closely with client BI users, operations, and development teams to encourage data-driven approaches
• Ensure deliveries are on time and of the required quality
• Maintain company data assets at the required quality levels
• Design and build solid, efficient, stable APIs (see the sketch after this list)
• Maintain high code quality and enforce best practices
• Keep up to date with the latest technologies and methodologies
• Ensure globally robust and highly scalable development approaches to support global users and services
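To make the API-building responsibilities above more concrete, here is a minimal REST API sketch in Python. Flask is only one possible framework for this kind of work, and the /orders endpoint, its fields, and the in-memory store are hypothetical stand-ins for illustration, not a description of 3Pillar's actual stack.

```python
# Minimal REST API sketch. The /orders resource and its fields are
# hypothetical; a real service would sit in front of a database.
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store as a stand-in for a real database.
ORDERS = {}


@app.route("/orders/<order_id>", methods=["GET"])
def get_order(order_id):
    """Return a single order, or 404 if it does not exist."""
    order = ORDERS.get(order_id)
    if order is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(order)


@app.route("/orders", methods=["POST"])
def create_order():
    """Create (or overwrite) an order from a JSON payload."""
    payload = request.get_json(force=True)
    ORDERS[payload["order_id"]] = payload
    return jsonify(payload), 201


if __name__ == "__main__":
    app.run(port=8000)
```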

🎯 Requirements

• Python development skills
• Ability to implement ETL data pipelines in Python (a minimal sketch follows this list)
• Creating REST APIs
• Advanced SQL scripting knowledge
• Experience with Google Cloud Platform, AWS, or Azure
• 2+ years of experience in data or software development
• Knowledge of big data platforms
• Knowledge of relational databases
• Knowledge of Git, Docker, and Bash
• Ability to propose, design, and implement ETL solutions in batch and real time
• Understanding of continuous delivery pipelines and ability to design such a process
• Ability to select appropriate technologies for a given task
• Experience with dbt (desirable)
• Experience with Dataflow or Apache Beam (desirable)
• Experience using Airflow (desirable)
• Experience with NoSQL databases such as Redis or Elasticsearch (desirable)
• Working knowledge of English (used daily)
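As a rough illustration of the ETL pipeline work named in the requirements, here is a minimal batch ETL sketch in Python. The orders.csv source, the column names, and the local SQLite target are hypothetical stand-ins; a production pipeline for a role like this would more likely load into a cloud warehouse (for example BigQuery or Redshift) and be orchestrated by a tool such as Airflow.

```python
# Minimal batch ETL sketch: extract rows from a CSV file, transform them,
# and load them into a local SQLite table. File name, schema, and target
# are hypothetical and used only for illustration.
import csv
import sqlite3


def extract(csv_path):
    """Yield raw rows from the source CSV file."""
    with open(csv_path, newline="") as f:
        yield from csv.DictReader(f)


def transform(rows):
    """Normalize fields and drop records missing a primary key."""
    for row in rows:
        if not row.get("order_id"):
            continue  # incomplete record; skip it
        yield (
            row["order_id"],
            row["customer"].strip().lower(),
            float(row["amount"]),
        )


def load(records, db_path="warehouse.db"):
    """Upsert transformed records into the target table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders ("
            "order_id TEXT PRIMARY KEY, customer TEXT, amount REAL)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", records
        )


if __name__ == "__main__":
    load(transform(extract("orders.csv")))
```

The same extract/transform/load split carries over to streaming or Beam-based pipelines; only the I/O layers change.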

🏖️ Benefits

• Flexible work environment (office, home, or blend)
• Remote-first approach from interviews to onboarding
• Part of a global team with cross-cultural collaboration and English as the working language
• Wellbeing-focused trimester with fitness offerings and mental health plans (country-dependent)
• Generous time off
• Career growth and development opportunities across projects, offerings, and industries
• Equal opportunity employer with a commitment to diversity

Similar Jobs

August 9

Trimble Inc.

10,000+ employees

Join Transporeon as a Data Engineer to implement scalable data solutions on AWS platforms.

Skills: Amazon Redshift, AWS, Azure, Cloud, Docker, DynamoDB, ETL, Kubernetes, Postgres, Python, SQL, Terraform

April 26

Join Tecknoworks as a Data Engineer. Develop robust data pipelines on AWS and Azure.

Skills: Airflow, Amazon Redshift, Apache, AWS, Azure, Cloud, Docker, ETL, Kafka, Kubernetes, Python, Spark, SQL
