Senior Data Engineer


September 28


3Pillar Global


3Pillar Global is a modern application strategy, design, and engineering firm that specializes in delivering strategic software development initiatives across industries. Its services include application technology strategy, digital product engineering, data and analytics, and artificial intelligence development. The firm helps organizations turn bold ideas into breakthrough solutions by leveraging technologies such as generative and multimodal AI, and works with partners and clients in healthcare, financial services, insurance, media, and information services to solve complex technology challenges and deliver high-performing solutions.

1001 - 5000 employees

☁️ SaaS

🏢 Enterprise

🤖 Artificial Intelligence

💰 Private Equity Round (October 2021)

📋 Description

• Build shippable software following the Engineering standards in place
• Build and maintain key Engineering blocks that other teams can rely on (APIs, Big Data implementations)
• Support the current stack and extend it with new features
• Work on ad-hoc R&D projects
• Work closely with client business intelligence users, operations, and development teams, encouraging a data-driven and pragmatic approach
• Ensure deliveries are on time and of the required quality
• Maintain the company’s data assets at the required quality levels
• Help design and build solid, efficient, stable APIs (see the sketch after this list)
• Help maintain a high standard of code
• Keep up to date with the latest technologies and methodologies
• Ensure globally robust and highly scalable development approaches
• Enforce best practices in code quality and process design
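For illustration only, here is a minimal sketch of the kind of small, stable API over a data asset that the responsibilities describe. The framework (Flask), the SQLite database, and the endpoint and table names are assumptions made for this example; they are not specified in the posting.

```python
# Illustrative sketch only: a minimal read-only API over a relational table,
# roughly the kind of "engineering block" other teams could rely on.
# Flask, the database path, and the table/endpoint names are assumptions.
import sqlite3

from flask import Flask, jsonify

app = Flask(__name__)
DB_PATH = "analytics.db"  # hypothetical local database


@app.get("/metrics/<string:name>")
def get_metric(name: str):
    """Return daily values for a named metric as JSON."""
    with sqlite3.connect(DB_PATH) as conn:
        conn.row_factory = sqlite3.Row
        rows = conn.execute(
            "SELECT day, value FROM daily_metrics WHERE metric = ? ORDER BY day",
            (name,),
        ).fetchall()
    if not rows:
        return jsonify({"error": "metric not found"}), 404
    return jsonify([{"day": r["day"], "value": r["value"]} for r in rows])


if __name__ == "__main__":
    app.run(port=8000)
```

Run locally and query with, for example, `curl http://localhost:8000/metrics/daily_orders`.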

🎯 Requirements

• Python development skills
• Ability to implement ETL data pipelines in Python (a minimal sketch follows this list)
• Experience creating REST APIs
• Advanced SQL scripting knowledge
• Experience with Google Cloud Platform, AWS, or Azure
• 2+ years of experience in data or software development
• Knowledge of big data platforms
• Knowledge of relational databases
• Experience with Git, Docker, and Bash
• Ability to propose, design, and implement simple ETL solutions in batch and real time
• Understanding of continuous delivery pipelines and the ability to design such a process
• Ability to pick the right technology for the task at hand
• Experience with dbt (data build tool) (desirable)
• Experience with Dataflow or Apache Beam (desirable)
• Experience with Airflow (desirable)
• Experience with NoSQL databases such as Redis or Elasticsearch (desirable)
• Daily communication in English
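For illustration only, here is a minimal sketch of a batch ETL step in plain Python, matching the "ETL data pipelines in Python" requirement. The source URL, record schema, and SQLite target are placeholders chosen for the example, not details from the posting.

```python
# Illustrative sketch only: a minimal extract-transform-load step in plain Python.
# The source URL, record fields, and target database are placeholder assumptions.
import sqlite3

import requests

SOURCE_URL = "https://example.com/api/orders"  # hypothetical REST source
DB_PATH = "warehouse.db"                       # hypothetical local target


def extract() -> list[dict]:
    """Pull raw order records from the upstream REST API."""
    resp = requests.get(SOURCE_URL, timeout=30)
    resp.raise_for_status()
    return resp.json()


def transform(records: list[dict]) -> list[tuple]:
    """Keep only the fields to load and drop rows missing an order id."""
    return [
        (r["order_id"], r.get("customer"), float(r.get("amount", 0)))
        for r in records
        if r.get("order_id") is not None
    ]


def load(rows: list[tuple]) -> None:
    """Upsert the cleaned rows into the target table."""
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders "
            "(order_id TEXT PRIMARY KEY, customer TEXT, amount REAL)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows
        )


if __name__ == "__main__":
    load(transform(extract()))
```

In practice, a step like this would typically be scheduled and monitored with an orchestrator such as the Airflow mentioned above, or expressed as dbt models and Dataflow/Beam jobs for larger volumes.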

🏖️ Benefits

• Flexible work environment (office, home, or blend)
• Remote-first approach
• Part of a global team with daily English communication
• Well-being-focused trimester
• Fitness offerings
• Mental health plans (country-dependent)
• Generous time off
• Career growth and development opportunities across projects, offerings, and industries
• Equal opportunity employer with inclusive values
