Senior Engineer, Data Management

Job not on LinkedIn

October 14

Apply Now

Abercrombie & Fitch Co.

Retail • Fashion • B2C

Abercrombie & Fitch Co. is a leading global omnichannel specialty retailer offering apparel and accessories for men, women, and kids. The company's brands, including Abercrombie & Fitch, YPB, abercrombie kids, Hollister, and Gilly Hicks, focus on providing high-quality, comfortable products that enable consumers worldwide to express their individuality and style. Abercrombie & Fitch Co. places a strong emphasis on digital transformation, sustainability, inclusion, and diversity. The company has been recognized as a great place to work for LGBTQ+ equality and is committed to community partnerships and global giving.

10,000+ employees

Founded 1892

🛒 Retail

👗 Fashion

👥 B2C

📋 Description

• Team up with the engineering teams and enterprise architecture (EA) to define standards, design patterns, accelerators, development practices, and DevOps/CI/CD automation
• Create and maintain the data ingestion, quality testing, and audit framework
• Conduct complex data analysis to answer queries from business users or technology team partners, whether raised directly by analysts or stemming from reporting tools such as Power BI, Tableau, or Looker
• Build and automate data ingestion, transformation, and aggregation pipelines using Azure Data Factory, Databricks/Spark, Snowflake, and Kafka, as well as enterprise scheduler tools such as CA Workload Automation or Control-M
• Set up and evangelize a metadata-driven approach to data pipelines to promote self-service (see the sketch after this list)
• Set up and continuously improve data quality and audit monitoring, as well as alerting, using applications such as Sod.
• Constantly evaluate process automation options and collaborate with platform engineering and architecture to review proposed designs
• Demonstrate mastery of build and release engineering principles and methodologies, including source control, branch management, build and smoke testing, and archiving and retention practices
• Adhere to, enhance, and document design principles and best practices by collaborating with Solution Architects and, in some cases, Enterprise Architects
• Participate in and support the Data Academy and Data Literacy program to train business users and technology teams on data
• Respond to SLA-driven production data quality or pipeline issues
• Work in a fast-paced Agile/Scrum environment
• Identify and assist with the implementation of DevOps practices in support of fully automated deployments
• Document data flow diagrams, data models, technical data mappings, and production support information for data pipelines
• Follow industry-standard data security practices and evangelize them across the team
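
To give a sense of the metadata-driven pipeline approach referenced above, here is a minimal, illustrative PySpark sketch: a config entry per source drives a generic ingestion loop. All paths, formats, and table names are hypothetical placeholders, not the company's actual framework.

```python
# Minimal, illustrative sketch of a metadata-driven ingestion step in
# PySpark on Databricks. The config, paths, and table names below are
# hypothetical; a real pipeline would read its metadata from a control
# table or config store and add quality checks and audit logging.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical metadata entries describing two ingestion tasks.
pipeline_config = [
    {"source_path": "/mnt/raw/orders/", "format": "parquet", "target_table": "bronze.orders"},
    {"source_path": "/mnt/raw/customers/", "format": "csv", "target_table": "bronze.customers"},
]

for task in pipeline_config:
    # Read the raw source exactly as described by the metadata entry.
    reader = spark.read.format(task["format"])
    if task["format"] == "csv":
        reader = reader.option("header", "true").option("inferSchema", "true")
    df = reader.load(task["source_path"])

    # Append into the target table (Delta by default on Databricks);
    # quality testing and auditing would hook in around this write.
    df.write.mode("append").saveAsTable(task["target_table"])
```

Adding a new source then means adding a metadata entry rather than new pipeline code, which is what enables the self-service model the role describes.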

🎯 Requirements

• 5+ years of experience in an Enterprise Data Management or Data Engineering role
• 5+ years of hands-on experience building metadata-driven data pipelines using Azure Data Factory and Databricks/Spark for a cloud data lake
• 5+ years of hands-on experience using one or more of the following for data analysis and wrangling: Databricks, Python/PySpark, Jupyter notebooks (a brief example follows this list)
• Expert-level SQL knowledge on databases such as, but not limited to, Snowflake, Netezza, Oracle, SQL Server, MySQL, and Teradata
• Experience working in a multi-developer environment and hands-on experience using either Azure DevOps or GitLab
• Experience with SLA-driven production data pipeline or quality support preferred
• Experience with, or a strong understanding of, traditional enterprise ETL platforms such as IBM DataStage, Informatica, Pentaho, Ab Initio, etc.
• Functional knowledge of some of the following technologies: Terraform, Azure CLI, PowerShell, containerization (Kubernetes, Docker)
• Functional knowledge of one or more reporting tools such as Power BI, Tableau, or OBIEE
• Team player with excellent communication skills, able to communicate directly with the customer and explain the status of deliverables in scrum calls
• Ability to implement Agile methodologies and work in an Agile DevOps environment
• Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field, and 5+ years of experience with various cloud technologies within a large-scale organization
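
As a rough illustration of the data wrangling and quality-audit work the requirements point to, the short PySpark snippet below computes a row count and null rates for a few key columns. The table and column names are hypothetical, not taken from the posting.

```python
# Minimal, illustrative PySpark data-wrangling / quality-audit example
# for Databricks. Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.table("bronze.orders")  # hypothetical ingested table
key_columns = ["order_id", "customer_id", "order_date"]

# Row count plus null rate per key column: the kind of metric a data
# quality / audit framework would track and alert on against an SLA.
audit = orders.agg(
    F.count(F.lit(1)).alias("row_count"),
    *[
        (F.sum(F.col(c).isNull().cast("int")) / F.count(F.lit(1))).alias(f"{c}_null_rate")
        for c in key_columns
    ],
)
audit.show()
```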

🏖️ Benefits

• Incentive bonus program
• 401(k) savings plan with company match
• Annual companywide review process
• Flexible spending accounts
• Medical, dental, and vision insurance
• Life and disability insurance
• Associate assistance program
• Paid parental and adoption leave
• Access to fertility and adoption benefits through Carrot
• Access to mental health and wellness app, Headspace
• Paid caregiver leave
• Mobile stipend
• Paid time off and one paid volunteer day per year, allowing you to give back to your community
• Work from anywhere (Mondays and Fridays are "work from anywhere" days for most roles, plus six work-from-anywhere weeks per year)
• Seven associate wellness half days per year
• Merchandise discount on all of our brands
• Opportunities for career advancement; we believe in promoting from within
• Access to multiple Associate Resource Groups
• Global team of people who will celebrate you for being YOU!


Similar Jobs

October 14

Software Engineer developing mobile applications at SmithRx, a health-tech company disrupting the Pharmacy Benefit Management sector. Leads the mobile app lifecycle and collaborates with stakeholders on best practices.

Android

AWS

Docker

GraphQL

iOS

JavaScript

Kubernetes

Node.js

Postgres

React

React Native

SQL

October 14

Growth Engineer at Polycam empowering users with 3D capture technology and driving growth through engineering, data, and experimentation.

BigQuery

React

SQL

TypeScript

October 13

Lead Software Engineer for Disney's Roku Client Application Engineering Team. Work on Direct-to-Consumer client apps like Disney+, Star+, ESPN, and Hulu.

AWS

JavaScript

Jenkins

Node.js

React

Roku

October 13

Software Engineer specializing in building cloud infrastructure for AI-native applications at Render. Designing and innovating solutions for a scalable platform used by millions of developers.

Ansible

Cloud

Grafana

Kubernetes

Linux

Postgres

Terraform

October 13

Senior Software Engineer developing the platform that supports web and mobile applications at Polycam. Collaborating with senior engineers to ensure scalability and efficiency in cloud infrastructure.

Cloud

Docker

Firebase

Google Cloud Platform

NoSQL

React

Terraform

TypeScript
