Senior Data Engineer – Enterprise Analytics

Job not on LinkedIn

October 10

Apply Now

Contentsquare

Artificial Intelligence • SaaS • eCommerce

Contentsquare is a digital experience analytics platform that utilizes AI technology to provide actionable insights for improving online engagement and conversion rates. The platform offers a comprehensive suite of features including experience analytics, product analytics, voice of customer insights, and experience monitoring, aimed at enhancing user journeys across digital interfaces. Trusted by over 3,700 enterprise brands, Contentsquare helps clients understand customer behavior and optimize digital experiences to boost business impact. It integrates with various systems to provide seamless analytics across websites and mobile apps, leveraging data to maximize customer engagement.

📋 Description

• Build, test, document, and deploy reliable data pipelines, working with diverse systems.
• Proactively contribute to the evolution of our Data Analytics Platform by proposing new ideas and exploring innovative big data tools. Our Platform is a critical product used by multiple departments across the company.
• Build and maintain our data infrastructure using DevOps best practices, ensuring scalability, reliability, and security.
• Effectively communicate and collaborate with a broad audience, tailoring your message to individuals with varying levels of technical expertise.

🎯 Requirements

• Strong strategic thinking skills, demonstrating an understanding of analytics needs in the business domain and proposing appropriate technical solutions.
• Autonomy and excellent people and time management skills to efficiently prioritize tasks and collaborate effectively within a distributed team that communicates mostly asynchronously.
• Mentoring skills, helping the rest of the team grow technically and professionally.
• Ability to think outside the box, finding the most suitable solutions to complex engineering problems.
• Effective oral and written communication in English to engage with a diverse audience, including both technical and non-technical stakeholders.

Technical skills:
• 5+ years of experience in Backend/Data Engineering roles
• Proficiency in Python and SQL
• Strong systems design skills with a demonstrated ability to architect, build, and maintain scalable and reliable data pipelines.
• Deep knowledge of data warehouses (e.g. Snowflake, ClickHouse…)
• Production experience with data orchestration solutions (e.g. Dagster, Airflow…); see the illustrative sketch after this list
• Experience deploying and maintaining containerized services on cloud providers using Infrastructure as Code
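
As a rough illustration of the orchestration experience listed above, here is a minimal sketch of a daily Airflow DAG with a single Python task that loads one day's partition into a warehouse. All names (the DAG id, task id, and load logic) are hypothetical placeholders and are not taken from Contentsquare's actual stack.

```python
# Minimal illustrative sketch of a daily batch pipeline orchestrated with Airflow 2.4+.
# All identifiers and the load logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load(ds: str, **_) -> None:
    """Extract one day's worth of data from a source system and load it into the warehouse."""
    # A real task would query the source, then write to a warehouse such as
    # Snowflake or ClickHouse, ideally idempotently keyed on the logical date `ds`.
    print(f"Loading partition for {ds}")


with DAG(
    dag_id="example_daily_warehouse_load",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow < 2.4 uses `schedule_interval` instead
    catchup=False,
) as dag:
    PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )
```

A Dagster asset or another scheduler would express the same idea; the point is a scheduled, idempotent load into the warehouse, tested and deployed like any other piece of software.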

🏖️ Benefits

• Virtual onboarding, a Hackathon, and various opportunities to interact with your team and global colleagues both on- and offsite each year
• Work flexibility: hybrid and remote work policies
• Generous paid time-off policy (every location is different)
• Immediate eligibility for birthing and non-birthing parental leave
• Wellbeing and Home Office allowances
• A Culture Crew in every country we’re based in to coordinate regular activities for employees to get to know each other and bond outside of work
• Every full-time employee receives stock options, allowing them to share in the company’s success
• Multiple Employee Resource Groups that offer a safe space for individuals who share common identities, life experiences, or allyship to connect, support one another, and passionately advocate for the issues close to their hearts
• And more benefits tailored to each country

Apply Now

Similar Jobs

September 21

Data Engineer on GCP building and operating data products at T-Systems. Develop infrastructure, data pipelines, CI/CD, security, and AI solutions using Terraform, Python, BigQuery, and Vertex AI.

Airflow • BigQuery • Cloud • DNS • Firewalls • Google Cloud Platform • Hadoop • PySpark • Python • Terraform

September 19

Design and evolve dLocal's scalable data platform for payments in emerging markets. Mentor engineers and drive data governance and architecture decisions.

Airflow • Apache • AWS • Cloud • Google Cloud Platform • Python • Spark • SQL

September 17

Build and maintain scalable ETL/ELT pipelines and lakehouse infrastructure; enable AI-driven analytics and data governance for Rithum's commerce platform.

Airflow • Amazon Redshift • Apache • AWS • Cloud • Docker • ETL • Kafka • Kubernetes • RDBMS • Spark • SQL

September 17

Senior Data Engineer building AI-enhanced data infrastructure for Rithum’s commerce network. Design scalable ETL/ELT pipelines and mentor engineers.

Airflow • Amazon Redshift • Apache • AWS • Cloud • Docker • ETL • Kafka • Kubernetes • RDBMS • Spark • SQL

August 20

Mid Data Engineer building data architecture and storage solutions for Volkswagen Group Services. Lead technical data strategy and implement cloud-based data platforms.

AWS • Azure • Cloud • NoSQL • Spark • SQL • Vault
