Senior Data Engineer

Job not on LinkedIn

3 hours ago


Datatonic

Artificial Intelligence • eCommerce • Telecommunications

Datatonic is an AI partner that addresses complex business challenges through advanced cloud data and AI consultancy. Its services include generative AI applications for industries such as Retail, Telecommunications, Gaming, and Financial Services. With a focus on data-driven decision-making and analytics, Datatonic empowers businesses to optimize their operations and enhance customer engagement across multiple sectors.

51 - 200 employees

Founded 2013

🤖 Artificial Intelligence

🛍️ eCommerce

📡 Telecommunications

💰 Pre-Seed Round, January 2013

📋 Description

• Foundational Support for Analytics and Data Science Teams: Build the infrastructure that enables analytics and data science teams to deliver innovative, impactful solutions for clients.
• Google Cloud Migration and Data Warehouse Solutions: Assist clients in migrating their existing business intelligence and data warehouse solutions to Google Cloud.
• Build Scalable Data Pipelines: Design, develop, and optimize robust data pipelines, making data easily accessible for visualization and machine learning applications.
• Design and Build Data Warehouses and Data Marts: Design and implement new data warehouse and data mart solutions, including:
  - Transforming, testing, deploying, and documenting data.
  - Understanding data modeling techniques.
  - Optimizing data storage for warehouse technologies.
• Manage Cloud Infrastructure: Architect, maintain, and troubleshoot cloud-based infrastructure to ensure high availability and performance.
• Collaboration with Technology Partners: Work closely with technology partners such as Google Cloud, Snowflake, dbt, and Looker, mastering their technologies and building a network with their engineers.
• Agile and Dynamic Team Collaboration: Collaborate in an agile, dynamic environment with a team of data engineers, BI analysts, data scientists, and machine learning experts.
• Applying Software Engineering Best Practices: Apply software engineering best practices, such as version control, testing, and continuous integration, to analytics processes (a minimal sketch follows this list).
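
For flavor, here is a minimal Python sketch of the "transform, test, document" loop described above. It is an illustration only, not Datatonic's codebase: the record shape, field names, and helper names are hypothetical, and a real pipeline would load into a warehouse such as BigQuery rather than work on in-memory dicts.

from datetime import date, datetime

def transform_orders(raw_rows):
    """Normalize raw order rows for warehouse loading: cast types, parse
    dates, and drop rows that are missing a primary key."""
    clean = []
    for row in raw_rows:
        if not row.get("order_id"):
            # Rows without a key cannot be merged downstream; skip them.
            continue
        clean.append({
            "order_id": str(row["order_id"]),
            "order_date": datetime.strptime(row["order_date"], "%Y-%m-%d").date(),
            "amount_usd": round(float(row["amount"]), 2),
        })
    return clean

def test_transform_orders():
    # A unit test kept next to the transform, so the continuous-integration
    # practice mentioned above can run it on every change.
    raw = [
        {"order_id": 1, "order_date": "2024-05-01", "amount": "19.9900"},
        {"order_id": None, "order_date": "2024-05-02", "amount": "5"},
    ]
    out = transform_orders(raw)
    assert out == [{"order_id": "1",
                    "order_date": date(2024, 5, 1),
                    "amount_usd": 19.99}]

if __name__ == "__main__":
    test_transform_orders()
    print("transform tests passed")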

🎯 Requirements

• Experience: 4+ years in a data-related role (e.g., Data Engineer, Data Analyst, Analytics Engineer).
• Technical Expertise: Hands-on experience with Looker, dbt, modern data warehouses such as Snowflake or BigQuery, and Kimball data modeling.
• Strong Programming Skills: Expertise in Python and/or Java, with proficiency in SQL.
• Experience in Data Engineering: 5+ years of experience designing and building scalable data solutions.
• High-Quality Code Standards: Ability to write tested, resilient, and well-documented code.
• Cloud Computing Experience: Experience building and maintaining cloud infrastructure (GCP or AWS is a plus).
• Problem-Solving Mindset: Ability to take ownership and drive projects from concept to completion.
• Project Management: Natural ability to manage multiple initiatives and clients simultaneously.
• SQL Proficiency: Skilled in writing analytical SQL, with an understanding of the difference between SQL that merely works and SQL that performs (illustrated after this list).
• Business Translation: Experience translating business requirements into technical solutions.
• Communication Skills: Ability to communicate complex ideas simply to a wide range of audiences.
• Leadership: Experience providing technical guidance and direction on projects.
• Cultural Alignment: Complete alignment with our culture of transparency, empathy, accountability, and performance.
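
On the "SQL that works vs. performant SQL" point, the sketch below (Python holding two query strings) shows the kind of distinction meant. The table and column names are hypothetical; the pattern of selecting only needed columns and pre-aggregating the large table before a join is what columnar warehouses like BigQuery and Snowflake reward.

# Correct but wasteful: scans every column of both tables and joins
# row-by-row before any aggregation happens.
SQL_THAT_WORKS = """
SELECT *
FROM orders o
JOIN customers c ON c.customer_id = o.customer_id
WHERE o.order_date >= '2024-01-01'
"""

# The per-customer aggregate a report actually needs, computed with far
# less scanning: prune columns, filter and aggregate the big table first,
# then join the much smaller result.
PERFORMANT_SQL = """
SELECT c.customer_id,
       c.region,
       o.total_spend
FROM (
    SELECT customer_id,
           SUM(amount_usd) AS total_spend
    FROM orders
    WHERE order_date >= '2024-01-01'  -- prunes partitions if the table is
    GROUP BY customer_id              -- partitioned by order_date
) AS o
JOIN customers c ON c.customer_id = o.customer_id
"""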

🏖️ Benefits

• 20 days of paid vacation per calendar year
• Public Holidays for your Province of Residence
• 5 Wellness days (sickness, personal time, mental health)
• 5 Lifestyle days (religious events, volunteer day, sick day)
• Matching Group Retirement Savings Plan after 3 months
• Competitive Group Insurance plan on Day 1 - individual premium paid 100%!
• Virtual Medicine and Family Assistance Program - 100% employer-paid!
• Home office budget - We are 100% remote!
• CAD $70/month for internet/phone expenses
• CAD $1,500 every 3 years for tech accessories and office equipment (monitor, keyboard, mouse, desk, etc.) starting on Day 1
• Company-supplied MacBook Pro or Air
• CAD $400/year for books, relevant app subscriptions, or an e-reader
• Opportunities for paid certifications
• Opportunities for professional and personal learning through Google and other training programs
• Regular company off-sites and meetups


Similar Jobs

23 hours ago

Fluent, Inc

201 - 500 employees

Data Engineer responsible for designing scalable data pipelines at a performance marketing company, collaborating with clients and internal teams to translate data into actionable insights.

AWS, Azure, ETL, Google Cloud Platform, PySpark, Python, SQL, Tableau

3 days ago

Data Engineer at eServices focuses on creating data solutions for accessibility and analysis. Collaborates with teams to enhance data quality and support business decisions.

Amazon Redshift, AWS, Azure, Cloud, ETL, Google Cloud Platform, Open Source, Oracle, Spark

6 days ago

Team Lead overseeing a high-performing data engineering team at Q4, an AI-driven investor relations platform. Responsible for building data pipelines and mentoring team members.

Amazon Redshift, AWS, Cassandra, EC2, ETL, NoSQL, Postgres, SDLC, SQL

November 25

Senior Data Engineer responsible for building scalable data solutions and supporting teams leveraging data at Jobber. Transforming operations and enhancing workflows within a cloud infrastructure.

Airflow, Amazon Redshift, AWS, Cloud, ETL, Python, Spark, SQL, Terraform

November 22

Senior Data Engineer designing and implementing data warehouses and pipelines for Leap Tools. Collaborating with engineering, ML, and product teams on data strategy.

Distributed Systems, Python, SQL
