Data Engineer

4 days ago

Apply Now

Bizimply

HR Tech • SaaS • Productivity

Bizimply is a comprehensive workforce management solution designed to streamline HR tasks, employee scheduling, and communication within businesses. Trusted by thousands of UK and Irish businesses, Bizimply simplifies building employee rotas, tracking time and attendance, and managing shift schedules across multiple locations. The platform enhances team productivity and satisfaction with tools for simplifying HR operations, optimizing labor costs, and monitoring employee performance in real time. Key features include seamless payroll integration, mobile clock-in and clock-out, instant messaging through the MyZimply app, and comprehensive operations management. Bizimply provides an efficient, user-friendly solution for managing both hourly and salaried teams, helping businesses stay organized, compliant, and productive.

11 - 50 employees

👥 HR Tech

☁️ SaaS

⚡ Productivity

💰 Series A (April 2016)

📋 Description

• Build and maintain robust, scalable data pipelines using Airflow, Python, and SQL.
• Design, build, and optimize data transformation pipelines using dbt or SQLMesh.
• Develop and manage the operational data store (ODS) and support the eventual rollout of a modern data warehouse.
• Integrate data from internal systems and external APIs to create clean, reliable datasets.
• Work closely with engineers to operationalize machine learning workflows.
• Ensure high data quality through monitoring, validation, and error handling.
• Provide guidance to less experienced team members and champion data engineering best practices.
• Deploy and manage cloud infrastructure (AWS, GCP, or Azure) using modern DevOps tooling.
• Implement monitoring and alerting to keep data pipelines reliable and maintainable.

🎯 Requirements

• 3-5 years of experience in data engineering or related roles.
• Strong skills in Python, SQL, and Airflow or similar orchestration tools.
• Experience working with cloud infrastructure and data warehousing tools (e.g., Snowflake, BigQuery, Redshift).
• Exposure to ML pipelines or collaboration with ML/AI teams.
• Ability to work independently while supporting a less-experienced team.
• Strong communication skills and an eagerness to mentor and share knowledge.
• Experience building an ODS or data warehouse from scratch.
• Familiarity with event-driven systems or streaming tools (e.g., Kafka, Pub/Sub).
• DevOps experience or infrastructure-as-code (e.g., Terraform, CloudFormation).

🏖️ Benefits

• Competitive compensation aligned with relevant experience.
• Remote-friendly, flexible work environment.
• Budget for learning, courses, and conferences.
• A supportive, mission-driven team eager to grow and learn together.

Apply Now

Similar Jobs

4 days ago

CDP Data Engineer at Software Mind, building modern data ecosystems with cloud technologies and analytical solutions.

Airflow

AWS

Cloud

Kafka

Python

Spark

SQL

November 25

Data Engineering Tech Lead overseeing a data engineering team and guiding project execution in Poland, focusing on data solutions and mentoring engineers in a collaborative environment.

Azure

Cloud

ETL

Python

Scala

SQL

November 23

Senior Data Engineer at Wavestone supporting clients with data engineering and analytics in cloud environments, collaborating on Data Intelligence solutions and serving as a technical expert in Azure.

🗣️🇩🇪 German Required

🗣️🇵🇱 Polish Required

Azure

ETL

SDLC

SQL

November 22

Senior Data Architect designing scalable, cloud-native data architectures for a leader in global media. Working on the Global Data Platform Modernization project and integrating AI and analytics.

Airflow

Apache

AWS

BigQuery

Cloud

Google Cloud Platform

Python
