Data Engineer

October 22

Apply Now

Beam Impact

eCommerce • B2B • Marketing

Beam Impact provides an end-to-end marketing engine that helps brands acquire and retain customers through shared values rather than relying solely on discounts. Through gamification centered on brand values, Beam Impact enables companies to increase revenue, build communities, and preserve margins. Its customizable platform lets businesses showcase their commitment to causes by featuring vetted nonprofits, personalized impact tracking, and gamified updates. This approach boosts customer conversion and retention while strengthening the emotional connection between brands and consumers, who can see the real-world impact of their purchases. Beam Impact offers solutions for subscription brands and integrates with popular eCommerce platforms, supported by third-party compliance and custom ROI reports.

11 - 50 employees

🛍️ eCommerce

🤝 B2B

📋 Description

• Design and plan new platform data solutions in collaboration with our Staff Data Engineer, product team, and internal stakeholders, using RFC and tech spec documents

• Collaborate with your peers to execute and implement high-quality software

• Assist your peers in planning and executing new features, bringing a focus on data platform requirements

• Drive your work from spec document to delivery, with assistance from our Staff Data Engineer and other peers

• Grow your skill set and be a force multiplier by sharing knowledge through code reviews, pair programming, 1:1 conversations, and broader team trainings

• Develop subject matter expertise in the Beam data ecosystem and our reporting platform through code development, directly supporting our Client Strategy team, Business Operations, and our partners

• Understand and strive to balance tech debt with practicality when designing solutions

• Gain knowledge of the nonprofit giving and e-commerce enablement spaces

• Foster a team culture around Beam's values of community, inclusivity, care, accountability, and support

• Strive for continuous improvement through goal setting, feedback, and other growth opportunities

🎯 Requirements

• 2+ years using Python and SQL to build data-intensive applications or insights, or 3+ years of experience as a Software Engineer using Python and SQL in a primarily Python codebase

• Experience working on production-level data systems in a collaborative team environment

• Proficient with Python and SQL

• Proficient with relational databases, particularly PostgreSQL

• Familiar with fundamental development principles and processes such as observability, performance optimization, continuous integration, automated testing, and cloud infrastructure

• Familiar with distributed computing architectures and with data warehouse and data lake storage patterns

• Strong communication skills, with the ability to explain and advocate for technical projects to non-technical roles

🏖️ Benefits

• Up to 10% annual bonus based entirely on individual performance

• Stock options

• 50% match on 401(k) contributions of up to 6% of base salary

• $100 monthly wellness stipend

• $750 annual professional development budget

Apply Now

Similar Jobs

October 22

Data Engineer maintaining ETL projects at a leading cardiac data management company. Collaborating with data, engineering, and clinical teams to build scalable data workflows.

Airflow • BigQuery • Cloud • Docker • ETL • Python • SQL • Tableau

October 21

Innovative engineering leader building and leading Expense Optimization Data Engineering team at Netflix. Guiding data engineers to create models for financial and cost management insights.

Spark

October 21

Junior Data Engineer contributing to healthcare data solutions at Cylinder. Building data pipelines and collaborating with senior team members in a supportive environment.

Airflow • Cloud • Distributed Systems • Python • SQL

October 21

Data Engineering Manager leading healthcare data platform development at Superlanet. Primarily remote role for Texas-based candidates, with hands-on engineering and architecture in Microsoft Azure.

Apache • Azure • Cloud • ETL • Kafka • PySpark • Scala • Spark • SQL • Unity

October 20

Data Engineer modernizing public health data architecture by building data pipelines and ensuring compliance. Collaborating with various teams to improve public health outcomes.

Cloud • ElasticSearch • ETL • Kafka • Python • SQL

Built by Lior Neu-ner. I'd love to hear your feedback — Get in touch via DM or support@remoterocketship.com