Data Engineer


8 hours ago


Datatonic

Artificial Intelligence • eCommerce • Telecommunications

Datatonic is a cloud data and AI consultancy that tackles complex business challenges with advanced data and AI solutions. Its services include generative AI applications for industries such as Retail, Telecommunications, Gaming, and Financial Services. With a focus on data-driven decision-making and analytics, Datatonic helps businesses optimize their operations and enhance customer engagement across multiple sectors.

51 - 200 employees

Founded 2013


💰 Pre-Seed Round, January 2013

📋 Description

• Build the infrastructure that enables analytics and data science teams to deliver innovative, impactful solutions for clients.
• Assist clients in migrating their existing business intelligence and data warehouse solutions to Google Cloud.
• Develop and optimize robust data pipelines, making data easily accessible for visualization and machine learning applications.
• Develop and implement new data warehouse and data mart solutions, including:
  - Transforming, testing, deploying, and documenting data.
  - Applying data modeling techniques.
  - Optimizing and storing data for warehouse technologies.
• Build, maintain, and troubleshoot cloud-based infrastructure to ensure high availability and performance.
• Work closely with technology partners such as Google Cloud, Snowflake, dbt, and Looker, mastering their technologies and building a network with their engineers.
• Collaborate in an agile, dynamic environment with a team of data engineers, BI analysts, data scientists, and machine learning experts.
• Apply software engineering best practices to analytics processes, such as version control, testing, and continuous integration (see the sketch below).
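To make the last point concrete: in a dbt-based stack like the one described here, "version control, testing, and continuous integration" typically means models written as plain SQL files, checked into git, and tested on every change. A minimal sketch of such a staging model follows; the `raw.orders` source and its columns are hypothetical, not taken from the posting.

```sql
-- models/staging/stg_orders.sql
-- A minimal dbt staging model: it selects from a declared source,
-- renames and casts raw fields, and exposes a clean interface for
-- downstream marts and Looker explores.
with source as (

    select * from {{ source('raw', 'orders') }}

),

renamed as (

    select
        cast(order_id as string)      as order_id,
        cast(customer_id as string)   as customer_id,
        cast(ordered_at as timestamp) as ordered_at,
        cast(amount as numeric)       as order_amount
    from source

)

select * from renamed
```

In dbt, uniqueness and not-null tests on `order_id`, along with column descriptions, would live in an accompanying YAML file and run in CI on every pull request, which is the kind of workflow the bullet above alludes to.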

🎯 Requirements

• 2+ years in a data-related role (e.g., Data Engineer, Data Analyst, Analytics Engineer).
• Hands-on experience with Looker, dbt, modern data warehouses such as Snowflake or BigQuery, and Kimball data modeling.
• Expertise in Python and/or Java, with proficiency in SQL.
• 2+ years of experience developing and building scalable data solutions.
• Ability to write tested, resilient, and well-documented code.
• Experience building and maintaining cloud infrastructure (GCP or AWS is a plus).
• Ability to take ownership and support projects from concept to completion.
• Ability to manage multiple initiatives and clients simultaneously.
• Skill in writing analytical SQL, with an understanding of the difference between SQL that merely works and performant SQL (see the example below).
• Experience translating business requirements into technical solutions.
• Ability to communicate complex ideas simply to a wide range of audiences.
• Alignment with our culture of transparency, empathy, accountability, and performance.
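The "performant SQL" requirement is easiest to see as a contrast. The sketch below assumes a hypothetical events table, partitioned by `event_date`, on a columnar warehouse such as BigQuery or Snowflake; the table and column names are illustrative only.

```sql
-- SQL that works: filters by wrapping a timestamp in a cast, which
-- can block partition pruning, and selects every column even though
-- only a few are needed.
select *
from analytics.events
where cast(event_ts as date) = date '2024-01-15';

-- Performant SQL: filter directly on the partition column and
-- project only the needed columns, so the warehouse prunes
-- partitions and reads far less data.
select user_id, event_name, event_ts
from analytics.events
where event_date = date '2024-01-15';
```

Both queries retrieve the same day's events, but on a large table the second can be orders of magnitude cheaper to run, since columnar warehouses bill and execute by the columns and partitions actually scanned.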

🏖️ Benefits

• 20 days of paid vacation per calendar year
• Public holidays for your province of residence
• 5 wellness days (sickness, personal time, mental health)
• 5 lifestyle days (religious events, volunteer day, sick day)
• Matching Group Retirement Savings Plan after 3 months
• Competitive group insurance plan on Day 1, with the individual premium 100% paid
• Virtual medicine and Family Assistance Program, 100% employer-paid
• Home office budget (we are 100% remote)
• CAD $70/month for internet/phone expenses
• CAD $1,500 every 3 years for tech accessories and office equipment (monitor, keyboard, mouse, desk, etc.), starting on Day 1
• Company-supplied MacBook Pro or Air
• CAD $400/year for books, relevant app subscriptions, or an e-reader
• Opportunities for paid certifications
• Opportunities for professional and personal learning through Google and other training programs
• Regular company off-sites and meetups


Similar Jobs

9 hours ago

Junior Data Engineer at KOHO, a fintech company, developing data pipelines and ensuring data quality for their financial services platform.

Airflow • Amazon Redshift • Apache • AWS • Cloud • ETL • Python • Spark • SQL • Terraform

Yesterday

Fluent, Inc

201 - 500

Data Engineer responsible for designing scalable data pipelines at a performance marketing company. Collaborates with clients and internal teams to translate data into actionable insights.

AWS • Azure • ETL • Google Cloud Platform • PySpark • Python • SQL • Tableau

3 days ago

Data Engineer at eServices focuses on creating data solutions for accessibility and analysis. Collaborates with teams to enhance data quality and support business decisions.

Amazon Redshift • AWS • Azure • Cloud • ETL • Google Cloud Platform • Open Source • Oracle • Spark

November 20

Hopper

201 - 500

Data Engineer responsible for building robust data pipelines and analytics systems for Hopper’s advertising business. Collaborate with engineering teams to ensure data integrity and enable insights.

Airflow • Amazon Redshift • BigQuery • Cloud • ETL • Kafka • Python • Scala • SQL

November 19

Data Engineer designing and maintaining data sets and data warehouses for a home care app. Collaborating with cross-functional teams to enhance data practices and quality at Thumbtack.

Airflow • Apache • BigQuery • Cloud • ETL • Kafka • Python • SQL
