Senior Data Engineer – Databricks


September 10


Look4IT

Recruitment • IT • SaaS

Look4IT is a recruitment agency specializing exclusively in the IT sector, providing staffing solutions for permanent and contract positions. Established in 2012, the company draws on over a decade of experience and a database of more than 270,000 candidates to match skilled IT professionals with employers quickly and reliably. Look4IT offers a wide range of recruitment services, including technical screenings, outsourcing, and process automation, tailored to the needs of the IT industry.

11–50 employees

Founded 2012

🎯 Recruiter

☁️ SaaS

📋 Description

• Implement core features under the guidance of a Technical Lead/Engineering Lead
• Conduct code reviews for team members
• Collaborate with teams across the company to incorporate customer feedback and deliver elegant solutions
• Take part in an on-call rotation to support our live products (further details will be provided during the interview process)
• Build new data products on AWS/Databricks (a minimal pipeline sketch follows this list)
• Manage Databricks platform deployment
• Create guardrails and best practices for using Databricks across the company
• Evaluate new Databricks offerings and how they can be leveraged
• Reduce ambiguity in complex problem spaces by leading technical discovery and prototyping efforts with strategic impact on the team
• Identify and investigate key problem and opportunity spaces, and formulate recommendations and strategies for whether and how to pursue them
• Prepare design docs and implementation strategies, and choose appropriate tools
• Work hands-on with live production systems
• Monitor production infrastructure (Datadog)
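For illustration only, not part of the posting: a minimal sketch in PySpark of the kind of AWS/Databricks data product pipeline the role describes. The table names raw_events and events_clean, and the column names event_id and event_ts, are hypothetical placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Sketch only: hypothetical table and column names; assumes a
    # Databricks-style environment where Delta is the default table format.
    spark = SparkSession.builder.appName("events-clean").getOrCreate()

    # Read a raw source table (placeholder name).
    raw = spark.read.table("raw_events")

    # Basic cleaning: de-duplicate, parse timestamps, drop unparseable rows.
    clean = (
        raw.dropDuplicates(["event_id"])
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .filter(F.col("event_ts").isNotNull())
    )

    # Persist as a managed Delta table for downstream consumers.
    clean.write.format("delta").mode("overwrite").saveAsTable("events_clean")

On Databricks a SparkSession is normally provided for you; the explicit builder call here just keeps the sketch self-contained.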

🎯 Requirements

• 5+ years of experience in Java/Scala or Python
• Bachelor's or higher degree in Computer Science, Software Engineering, or a related field
• Fluency in English; it is our daily business language
• Knowledge of Infrastructure-as-Code tooling, e.g. Terraform, Ansible
• Good knowledge of and experience operating AWS services and networking, e.g. S3, EC2, SG, VPC, ASG, R53, RDS
• Experience building data pipelines or data products with Databricks
• Experience managing Databricks as a platform
• Experience with Spark/PySpark
• Experience with streaming technologies, e.g. Kafka Streams (a streaming sketch follows this list)
• Effective communication and teamwork skills
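Again for illustration only: Kafka Streams itself is a JVM library, so as a Python-side analogue here is a small Spark Structured Streaming job consuming from Kafka. The broker address and topic name are hypothetical, and the job assumes the Spark Kafka connector is available on the cluster.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Sketch only: broker and topic are placeholders; requires the
    # spark-sql-kafka connector package on the cluster.
    spark = SparkSession.builder.appName("kafka-counts").getOrCreate()

    # Subscribe to a Kafka topic; records arrive with binary key/value
    # columns plus metadata such as topic, partition, offset, timestamp.
    stream = (
        spark.readStream.format("kafka")
             .option("kafka.bootstrap.servers", "broker:9092")
             .option("subscribe", "events")
             .load()
    )

    # Count events per one-minute window of the Kafka record timestamp.
    counts = (
        stream.groupBy(F.window(F.col("timestamp"), "1 minute"))
              .count()
    )

    # For the sketch, print running counts to the console.
    query = (
        counts.writeStream.outputMode("complete")
              .format("console")
              .start()
    )
    query.awaitTermination()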

🏖️ Benefits

• High-spec MacBook, 34" monitor, and a home office budget
• Multisport Plus card
• Private health insurance and life insurance
• Access to AWS courses to grow your expertise, with rewards for certifications along the way


Similar Jobs

September 4

Data Engineer building Microsoft Fabric, Delta Lake and Apache Spark pipelines on Azure for a global professional services firm. Design, optimize and secure scalable data transformation workflows.

Apache

Azure

PySpark

Python

Scala

Spark

July 5

Work as a Cloud Data Architect on GCP, providing data solutions for clients.

🗣️🇵🇱 Polish Required

Apache

BigQuery

Cloud

ETL

Google Cloud Platform

SQL

Terraform

June 25

Join Data Solutions as a Senior Cloud Data Engineer. Focus on GCP data platforms and ETL/ELT processes.

🗣️🇵🇱 Polish Required

Airflow

BigQuery

Cloud

ETL

Google Cloud Platform

Python

Roku

SQL

May 27

Join a remote team as a Data Engineer, developing data lakes and real-time solutions with Azure.

🗣️🇵🇱 Polish Required

Azure

Cloud

Docker

ETL

Kubernetes

Linux

Node.js

Numpy

Pandas

PySpark

Python

Spark

SQL

April 17

Seeking a Data Engineer at Provectus to tackle technical challenges in data and ML.

Airflow

Apache

AWS

Azure

Cloud

ETL

Flask

Google Cloud Platform

Kafka

Open Source

Python

Spark

SQL

Tableau

Terraform
