
SaaS • B2B
Jobber is a SaaS platform that helps home-service businesses (such as HVAC, plumbing, landscaping, cleaning, and other field service trades) manage quoting, scheduling, dispatch, invoicing, payments, customer communications, and operations from a single app. It offers CRM, automated workflows, AI features for job pricing and communications, online booking and payments, and reporting to help small and midsize service contractors save time, win more jobs, and grow revenue.
501 - 1000 employees
Founded 2011
☁️ SaaS
🤝 B2B
November 25

• Build scalable data solutions: Design, develop, and maintain batch and real-time data pipelines within cloud infrastructure (preferably AWS)
• Empower internal teams: Develop tools and frameworks that automate manual processes, set up alerting/monitoring systems, and help teams run data-driven experiments and analyze results
• Accelerate business growth: Collaborate with data analysts, scientists, and product teams to extract actionable insights from data
• Strategic planning and innovation: Lead initiatives to research and propose new technologies and tooling for our data stack
• Data integrity: Own the integrity of our data and maintain a high level of trust across the organization
• Strong coding skills in Python and SQL
• Expertise in building and maintaining ETL pipelines using tools like Airflow and dbt
• Experience working with AWS data infrastructure, particularly Redshift, Glue, Lambda, and ECS Fargate
• Familiarity with handling large datasets using tools like Spark or similar (e.g., Trino)
• Experience with Terraform for infrastructure management
• Experience with dimensional modelling, star schemas, and data warehousing in a cloud environment (preferably AWS Redshift)
• Knowledge of CI/CD processes, data ingestion, and optimizing data flow across systems
• Proficient in working with high-volume, scalable data infrastructure
• Health insurance
• Retirement savings matching
• Extended health package with fully paid premiums for body and mind
• Stipends for health and wellness
• Equity rewards
• Talent development program that includes career coaching and opportunities for career development
• Support for vacations and health days
November 22
Senior Data Engineer designing and implementing data warehouses and pipelines for Leap Tools. Collaborating with engineering, ML, and product teams on data strategy.
November 20
201 - 500 employees
Data Engineer responsible for building robust data pipelines and analytics systems for Hopper’s advertising business. Collaborate with engineering teams to ensure data integrity and enable insights.
November 18
51 - 200 employees
Senior Manager leading Data Engineering at Docker, driving innovative data analytics and infrastructure. Collaborating across teams to enable data-driven decision-making and support product development.
🇨🇦 Canada – Remote
💵 $231.6k - $318.5k / year
💰 $105M Series C on 2022-03
⏰ Full Time
🟠 Senior
🚰 Data Engineer
November 15
Data Engineer designing and optimizing data pipelines and warehouses leveraging Snowflake and dbt. Permanent, full-time remote position in Canada within a dynamic tech environment.
🗣️🇫🇷 French Required
November 14
Data Architect supporting clients with data architecture and analytics solutions at 3Pillar. Collaborating with business leaders to derive value from their data and implement architecture roadmaps.