Data Engineer


October 6


TIP Group

Transport

TIP Group specializes in truck and trailer leasing, rental, maintenance, and repair services. Operating across Europe and North America, the company offers a flexible range of rental and leasing solutions tailored to diverse client needs. Beyond leasing and rental, TIP Group provides digital telematics-based services through TIP Insight, sells used vehicles, and pursues eco-friendly initiatives aligned with its sustainability goals. The company is committed to delivering top-class maintenance through its certified technicians and workshops, and continues to extend its services through strategic acquisitions, such as the Trailer Auto Group, which strengthens its refurbishment capabilities.

1001 - 5000 employees

🚗 Transport

📋 Description

• Design, develop, and maintain a data lake/warehouse platform, ensuring reliability, scalability, performance, and security.
• Build and optimize data pipelines to ingest, process, and consolidate data from multiple sources, including both third-party systems and internal applications.
• Implement and support both real-time streaming and batch data processing frameworks to keep the data platform up-to-date and analytics-ready at all times.
• Develop complex code-based ETL/ELT data pipelines with performance-optimised data modelling (a minimal pipeline sketch follows this list).
• Collaborate with stakeholders across engineering, analytics, product, and business teams to define and execute data platform roadmaps, balancing business needs with technical sustainability.
• Champion best practices in data governance, quality, lineage, privacy, and security for all workflows and datasets.
• Proactively troubleshoot and resolve data issues, automate recurring processes, and optimize operational efficiency at scale.
• Continually research and adopt new data engineering methods and technologies, especially within the GCP ecosystem, for real-time and batch processing.
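To give a concrete flavour of the batch side of this work, below is a minimal sketch of a Cloud Composer (Airflow) ELT pipeline that stages raw files from GCS into BigQuery and then transforms them with SQL. All bucket, project, dataset, and table names are hypothetical placeholders for illustration, not actual TIP Group systems.

```python
# Minimal batch ELT sketch on Cloud Composer (Airflow 2.x).
# Assumes the apache-airflow-providers-google package is installed and
# that the hypothetical bucket, dataset, and tables below exist.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_telematics_load",  # hypothetical pipeline name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Stage raw CSV drops from a third-party system into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="example-landing-bucket",  # hypothetical bucket
        source_objects=["telematics/{{ ds }}/*.csv"],
        destination_project_dataset_table="example_project.staging.telematics",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staged rows into an analytics-ready table with SQL (ELT).
    transform = BigQueryInsertJobOperator(
        task_id="transform",
        configuration={
            "query": {
                "query": """
                    SELECT vehicle_id,
                           DATE(event_ts) AS event_date,
                           AVG(speed_kmh) AS avg_speed_kmh
                    FROM `example_project.staging.telematics`
                    GROUP BY vehicle_id, event_date
                """,
                "destinationTable": {
                    "projectId": "example_project",
                    "datasetId": "analytics",
                    "tableId": "daily_vehicle_stats",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```

Staging raw data first and transforming it inside BigQuery keeps the raw layer reloadable and pushes the heavy lifting to the warehouse, which is the usual ELT pattern on GCP.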

🎯 Requirements

• 5+ years' hands-on experience in data engineering, with a proven track record of architecting and operating enterprise-scale data platforms.
• Experience building and deploying data lakes and data warehouses on GCP using services including BigQuery, Datastream, Dataflow, Data Fusion, Pub/Sub, Composer (Airflow), Cloud SQL, Cloud Functions, and GCS (a streaming ingest sketch follows this list).
• Strong working knowledge of PostgreSQL, Python, and SQL.
• Demonstrated experience integrating, transforming, and loading data from multiple heterogeneous sources into a unified warehouse environment.
• The ability to engage with senior-level stakeholders.
• Strong collaborator, able to work cross-functionally to understand business objectives and translate requirements into robust data platform solutions.
• Experience with data security, privacy, and compliance frameworks.
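For the streaming side, here is a minimal sketch of real-time ingest that consumes JSON events from a Pub/Sub subscription and streams them into a BigQuery staging table. The project, subscription, and table names are assumptions for illustration; at scale, a managed Dataflow pipeline or the BigQuery Storage Write API would be the more robust choice.

```python
# Minimal streaming ingest sketch: Pub/Sub -> BigQuery.
# Assumes google-cloud-pubsub and google-cloud-bigquery are installed,
# and that the hypothetical subscription and table below exist.
import json

from google.cloud import bigquery, pubsub_v1

PROJECT_ID = "example-project"               # hypothetical project
SUBSCRIPTION = "telematics-events-sub"       # hypothetical subscription
TABLE_ID = "example-project.staging.events"  # hypothetical table

bq_client = bigquery.Client(project=PROJECT_ID)
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION)


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    """Parse one JSON event and append it to the staging table."""
    row = json.loads(message.data.decode("utf-8"))
    errors = bq_client.insert_rows_json(TABLE_ID, [row])
    if errors:
        message.nack()  # let Pub/Sub redeliver on failure
    else:
        message.ack()


streaming_pull = subscriber.subscribe(sub_path, callback=callback)
print(f"Listening on {sub_path}...")
try:
    streaming_pull.result()  # block until cancelled
except KeyboardInterrupt:
    streaming_pull.cancel()
```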

🏖️ Benefits

• Equity options - share in the company's success.
• Remote-first flexibility: choose remote, hybrid, or a setup that works best for you.
• A team that genuinely loves working here: we scored 100% eNPS in 2023.
• Regular company events - from team dinners to Pilates sessions and bakery runs.
