
eCommerce • Retail • B2C
Balsam Brands is a global eCommerce retailer specializing in holiday and home décor. Founded in 2006, with offices in Boise, Redwood City, Ireland, and the Philippines, it is best known for its flagship brand, Balsam Hill™, the leading retailer of artificial Christmas trees. Balsam Brands places a strong emphasis on its people-first company culture, relationship building, and sharing joy through its products. It has expanded its product offerings to include fresh and preserved greenery and gourmet food and gifts, and has opened new offices in Canada and Mexico City. The company values authenticity and doing the right thing, fostering a work environment that encourages personal growth and joy.
October 24

• Design and implement robust, scalable, and high-performance data solutions using Snowflake, dbt, and Python.
• Be accountable for building and maintaining the organization’s data infrastructure.
• Champion the data warehouse by creating denormalized data foundation layers and normalized data marts.
• Work on all aspects of the data warehouse/BI environment, including architecture, design, development, automation, caching, and performance tuning.
• Build infrastructure for optimal extraction, transformation, and loading (ETL) of data from various sources using SQL and cloud data platforms like Snowflake (see the sketch after this list).
• Define strategies to capture all data sources and assess the impact of business process changes on data inputs.
• Lead the migration of existing data platforms to Snowflake, ensuring minimal disruption to business operations.
• Manage the full lifecycle of data within Snowflake, from ingestion and storage to analytics and reporting.
• Conduct performance tuning and troubleshooting of the Snowflake environment to ensure optimal efficiency.
• Identify, design, and implement internal process improvements, such as re-architecting for scalability, optimizing data delivery, and automating manual processes.
• Collaborate with systems analysts and cross-functional partners to understand data requirements.
• Work with stakeholders across Executive, Product, Data, and Design teams to support data infrastructure needs and resolve data-related technical issues.
• Continually explore emerging technologies such as Big Data, Artificial Intelligence, Generative AI, Machine Learning, and Predictive Data Modeling to enhance data capabilities.
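For illustration, here is a minimal sketch of the kind of Python-to-Snowflake load step this role covers, using the snowflake-connector-python package. The account, credentials, and table/column names are hypothetical, not taken from the posting.

```python
# Minimal sketch of a Python -> Snowflake landing-table load.
# Assumes snowflake-connector-python; all identifiers below are hypothetical.
import snowflake.connector

def load_orders(rows):
    """Stage raw order rows into a Snowflake landing table."""
    conn = snowflake.connector.connect(
        account="MY_ACCOUNT",   # hypothetical account identifier
        user="ETL_USER",        # hypothetical service user
        password="...",         # in practice, pull from a secrets manager
        warehouse="ETL_WH",
        database="RAW",
        schema="ORDERS",
    )
    try:
        cur = conn.cursor()
        # Bulk parameterized INSERT; %s is the connector's default binding style
        cur.executemany(
            "INSERT INTO raw_orders (order_id, amount, loaded_at) "
            "VALUES (%s, %s, CURRENT_TIMESTAMP())",
            rows,
        )
        conn.commit()
    finally:
        conn.close()
```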
• 8+ years of professional experience in the data engineering field
• Polyglot programming expertise; hands-on, current Python experience is a must-have
• Experience with marketing channel data automation, pipeline monitoring, and data delivery is strongly preferred
• Extensive experience designing and developing on the Snowflake Cloud Data Platform
• Proficiency with one or more cloud platforms such as AWS, Azure, and/or GCP
• Proficiency in designing and implementing data pipelines that draw on diverse data sources, including databases, APIs, external data providers, and streaming sources
• Demonstrated history of designing efficient data models using the Medallion Architecture (see the sketch after this list)
• Deep understanding of and experience with relational (SQL Server, Oracle, Postgres, and MySQL) and NoSQL databases
• Experience building and supporting REST APIs for both inbound and outbound data workflows
• Solid grasp of distributed systems concepts for designing scalable and fault-tolerant data architectures
• Excellent critical thinking to perform root cause analysis on external and internal processes and data, identify opportunities for improvement, and answer questions
• Excellent analytical skills for working with structured and unstructured datasets
• Ability to build processes that support data transformation, workload management, data structures, dependency management, and metadata
• Ability to build and optimize data sets, ‘big data’ pipelines, and architectures
• Ability to understand and tell the story embedded in the data at the core of the business
• Ability to communicate with non-technical audiences from a variety of business functions
• Strong knowledge of coding standards, best practices, and data governance
• Demonstrated AI literacy, enabling effective understanding, interaction with, and critical evaluation of AI technologies and applications across diverse business functions
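As a concrete reference for the Medallion Architecture requirement, here is a minimal sketch of a bronze → silver → gold layering pass, expressed as Snowflake SQL driven from Python and reusing a connection like the one opened in the previous sketch. All schema, table, and column names are hypothetical.

```python
# Bronze -> silver -> gold layering expressed as Snowflake SQL statements.
# Hypothetical schemas/tables; each layer builds only on the one below it.
MEDALLION_STEPS = [
    # Bronze: raw landing data, kept as-is for replayability
    """CREATE TABLE IF NOT EXISTS bronze.orders AS
       SELECT * FROM raw.raw_orders""",
    # Silver: cleaned, typed, deduplicated records
    """CREATE OR REPLACE TABLE silver.orders AS
       SELECT DISTINCT order_id, TRY_TO_NUMBER(amount) AS amount, loaded_at
       FROM bronze.orders
       WHERE order_id IS NOT NULL""",
    # Gold: business-facing aggregate for reporting
    """CREATE OR REPLACE TABLE gold.daily_revenue AS
       SELECT DATE_TRUNC('day', loaded_at) AS day, SUM(amount) AS revenue
       FROM silver.orders
       GROUP BY 1""",
]

def run_medallion(conn):
    """Execute the bronze -> silver -> gold steps in order."""
    cur = conn.cursor()
    for sql in MEDALLION_STEPS:
        cur.execute(sql)
```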
• HMO coverage
• Mental health support
• Paid time off
• 13th-month pay
• Company trips
• Wellness benefits
• Internet Subsidy
• Healthcare Coverage (+ 2 dependents)
• Maternity, Paternity, and Solo Parent Benefit
• Continuous Learning and Professional Development Benefit
• Company Incentive
• Meeting and Team Building Allowance
• Shutdown Week
• Volunteer Time Off
• Bereavement Leave
October 24
Senior Data Engineer responsible for developing and maintaining data pipelines using Azure services. The role also involves data governance, compliance, and cross-functional collaboration with software engineering teams.
Azure
Cyber Security
ETL
Python
SQL
.NET
October 22
Data Engineer responsible for developing ELT solutions and collaborating with business partners to implement changes remotely. The role involves maintaining data applications and analyzing requirements.
Azure
ETL
Informatica
Python
SSIS
October 20
Data Engineer designing and maintaining robust data pipelines at ScalableOS. The role collaborates cross-functionally on data integration and modernization within the team.
ETL
PySpark
Python
October 15
Data Warehouse Developer maintaining Reinsurance Data Warehouse Applications for Arch Capital Group. The role involves collaborating with teams to analyze business requirements and implement ETL solutions in a deadline-driven environment.
Azure
ETL
Python
October 10
Data Architect managing the SQL data estate and leading data enrichment across the Azure platform. The role designs scalable pipelines and sets quality standards in data architecture.
Azure
ETL
Pandas
Postgres
PySpark
Python
SQL
Terraform
Vault