Senior Data Engineer

Job not on LinkedIn

September 21

Under Armour

B2C • Retail • Sports

Under Armour is a leading sports apparel and footwear company specializing in innovative performance gear. Its products are designed to enhance athletic performance, providing comfort and functionality for athletes and fitness enthusiasts. The company covers a wide range of categories, from casual wear to specialized work clothing, so its offerings meet the varied needs of its customers.

10,000+ employees

Founded 1996

💰 $100M Post-IPO Secondary in May 2023

📋 Description

• Work across teams to build pipelines, services, and tools that provide UA teammates and integrated systems with the data, information, and knowledge needed to fulfill Under Armour’s mission
• Design and build data products and data flows for the continued expansion of the data platform mission; deployed code is performant, well-styled, validated, and documented
• Implement data platform architectural designs and approaches
• Instrument and monitor the systems and products of the data platform
• Design, build, integrate, and maintain data ingestion pipelines from 3rd-party source systems and/or data providers
• Design, build, integrate, and maintain data ingestion pipelines from offline sources, including various 3rd-party services
• Design and integrate data replication solutions from various UA enterprise or other back-end services to the Data Platform
• Support data catalog initiatives and adoption across the Data Platform user community
• Support data cleansing and data quality initiatives for the Data Platform’s foundational data
• Support identity resolution initiatives across the Data Platform
• Lead one or two medium or large projects at a time
• Other data engineering duties as assigned

🎯 Requirements

• Bachelor's degree in Computer Science, Information Systems, or a closely related technical field plus 5 years of progressively responsible data and/or software engineering experience, OR a master's degree in Computer Science, Information Systems, or a closely related technical field plus 3 years of progressively responsible data and/or software engineering experience
• 2 years of experience with data engineering fundamentals (design patterns, common practices)
• 2 years of experience building high-volume data products and services
• 2 years of experience with security- and privacy-by-design frameworks
• 2 years of experience with the data science lifecycle
• 2 years of programming in SQL
• 2 years of programming in Python
• 2 years of experience with Snowflake
• Demonstrated knowledge of job orchestration tools such as Airflow or similar (a minimal sketch follows below)
• Demonstrated knowledge of AWS data-related products such as EMR, Glue, S3, and/or Lambda
• Experience with data and/or software engineering with progressive responsibility
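
To give a concrete flavor of the orchestration and Snowflake work listed above, here is a minimal, hypothetical sketch of a daily Airflow DAG that pulls a third-party extract and loads it into Snowflake. This is an illustration only, not Under Armour's actual stack: every identifier (the DAG ID, stage, table, and connection values) is a placeholder invented for this example.

```python
# Hypothetical example: a daily Airflow DAG that lands a partner extract
# and copies it into a Snowflake staging table. All names are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
import snowflake.connector


def extract_partner_feed():
    """Pull the day's extract from a 3rd-party source (stubbed here)."""
    # In practice: call the partner API or read from SFTP, then land the
    # raw files in an external stage (e.g. S3) that Snowflake can read.
    pass


def load_to_snowflake():
    """Copy the landed files into a Snowflake staging table."""
    conn = snowflake.connector.connect(
        account="my_account",   # placeholder; use Airflow connections or a
        user="etl_user",        # secrets manager for real credentials
        password="***",
        warehouse="ETL_WH",
        database="RAW",
    )
    try:
        conn.cursor().execute(
            "COPY INTO raw.partner_feed FROM @partner_stage "
            "FILE_FORMAT = (TYPE = 'JSON')"
        )
    finally:
        conn.close()


with DAG(
    dag_id="partner_feed_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # one run per day, after midnight UTC
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_partner_feed)
    load = PythonOperator(task_id="load", python_callable=load_to_snowflake)
    extract >> load  # load runs only after the extract succeeds
```

In a production setting the credentials would come from Airflow connections or a secrets manager rather than being inlined, and the COPY target would be validated as part of the data quality work the role describes.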

🏖️ Benefits

• Paid "UA Give Back" Volunteer Days: Work alongside your team to support initiatives in your local community • Under Armour Merchandise Discounts • Competitive 401(k) plan matching • Maternity and Parental Leave for eligible and FMLA-eligible teammates • Health & fitness benefits, discounts and resources- programs to promote physical activity and overall well-being

Similar Jobs

September 20

Build scalable ETL/ELT pipelines and data models for BI at Fusion Connect, ensuring data quality and cloud integration.

AWS • Azure • Cloud • ERP • ETL • Google Cloud Platform • Python • SQL • Tableau

September 20

Data Engineer building and maintaining Python/Spark pipelines and data quality for Veeva's OpenData reference datasets in the life sciences sector.

Airflow • AWS • Cloud • Java • Python • Spark • SQL

September 19

NBCUniversal

10,000+ employees

📱 Media

Senior Data Engineer building data platforms and serverless pipelines for NBCUniversal. Designs ETL, CI/CD, IaC, and works with analytics and ad-tech datasets.

Amazon Redshift • AWS • Cloud • ETL • Kafka • Linux • MySQL • Oracle • Pandas • Postgres • Python • Scala • Spark • SQL • Tableau • Terraform

September 17

Lead secure AWS IaC, CI/CD, containerization, and data pipelines for Via Logic serving DHS and other federal clients.

Ansible • AWS • Chef • Cloud • Cyber Security • Docker • DynamoDB • EC2 • ETL • Java • Jenkins • Kubernetes • Microservices • Oracle • Postgres • Python • Redis • SQL • Tableau • Terraform

September 17

Manager, Data Engineering leading the data platform at a context-based insurance company. Build scalable data pipelines, mentor engineers, and integrate DevSecOps practices in AWS cloud-native environments.

AWS • Cloud • Docker • ETL • Kubernetes • Python • Terraform
