We believe a run can change a day, a life, the world
Running Shoes • Running Apparel • Bras • Apparel • Retail
1001 - 5000
April 25
• Collaborate with Lead Data Engineers and the domain Architect to design, build, and optimize the data architecture and extract, transform, load (ETL) pipelines, making data accessible to data users
• Own and support data engineering platform tools using technologies such as Snowflake, AWS, HVR, and KNIME
• Design robust ETL pipelines of medium complexity that adhere to existing patterns while maintaining performance, uptime, scalability, and extensibility and minimizing technical debt
• Develop and perform unit tests; maintain up-to-date code in source control
• Collaborate with other Data Engineers on code reviews and participate in pair programming when needed
• Independently troubleshoot user-reported issues and ETL job errors with minimal guidance, participate in the on-call rotation, and perform root cause analysis
• Deliver quality code and follow best practices and standards, keeping performance and scalability in mind to keep costs in check in a cloud environment
• Partner with the Platform Product Manager to prioritize and deliver high-quality data products, working in an agile team
• Live a culture of sharing, re-use, design for scale and stability, and operational efficiency of data and analytical solutions; demonstrate a passion for innovation and continuous improvement
• Maintain awareness of advancements and changes in technologies related to data engineering and cloud data platforms
• Bring a positive Run Happy energy and work with the team to deliver the best possible solutions
• Learn the business and the data that supports it; be a partner – don’t just implement technology
• Live Brooks’ values
• Other responsibilities as required
• Bachelor’s degree or equivalent work experience in Computer Science, Engineering, Math, Information Systems, or a related discipline
• 5+ years of professional experience in data warehouse ETL or data engineering development
• 3–5 years of hands-on experience with Python and a solid understanding of object-oriented programming concepts
• Proficient in SQL, with experience working on complex transformations
• 2–3 years of experience with cloud data platforms (Snowflake and Microsoft Fabric preferred)
• Experience with source control tools (Git/Bitbucket)
• 3+ years of current professional experience in cloud-based ETL data engineering development (AWS preferred); certification is a plus
• Solid understanding of data modeling and data architecture concepts
• Experience with data orchestration and transformation tools (Airflow and dbt preferred)
• Experience with data analysis, unit testing, and data quality validation
• Excellent verbal and written communication skills, demonstrating effective listening through concise, clear communication
• Good interpersonal skills and demonstrated problem-solving ability
• Embraces and lives the Brooks values
• Competitive salary
• Health benefits
• Paid time off
• Employee discounts
51 - 200
🇺🇸 United States – Remote
💵 $120k / year
💰 $25M Series B on 2022-07
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
🗽 H1B Visa Sponsor