Data Engineer

October 25

Apply Now

CaptivateIQ

Finance • Software as a Service (SaaS) • Enterprise

CaptivateIQ is a modern sales commission solution that helps businesses significantly improve revenue performance. By automating and simplifying the entire commission process, it lets sales teams focus on selling rather than administrative tasks. The platform offers real-time visibility and analytics tools that motivate sales representatives and give sales leaders actionable insight. The solution is flexible, transparent, and designed to scale with growing businesses, with features such as AI-powered intelligence and SmartGrid™ technology. It serves industries including financial services, manufacturing, and media and entertainment, providing a strategic advantage in managing and optimizing sales performance.

201 - 500 employees

Founded 2017

💸 Finance

🏢 Enterprise

💰 $100M Series C on 2022-01

📋 Description

• Design and implement customer data integrations using iPaaS platforms such as Workato, Boomi, or MuleSoft.

• Write and optimize advanced Snowflake SQL, including DDL, DML, CTEs, UDFs, and JSON transformations, to power analytics and automation.

• Build and maintain software that orchestrates customer data integrations, using web APIs and iPaaS connectors to exchange data seamlessly between CaptivateIQ and its customers.

• Ensure customer integrations are secure, testable, fault-tolerant, and designed for long-term maintainability.

• Collaborate with Sales Engineering and Product during discovery and scoping to advise on technical feasibility and data architecture approaches.

• Partner with product engineering to keep customer data integrations scalable and manageable as customer usage grows.

• Translate customer business needs into technical requirements and solutions.
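The integration work described above centers on exchanging JSON payloads with customer web APIs and reshaping them for downstream SQL. A minimal Python sketch of one such step is shown below; the payload shape, field names, and flattening convention are illustrative assumptions, not CaptivateIQ's actual schema or code:

```python
import json

def flatten_record(record: dict, parent_key: str = "", sep: str = "_") -> dict:
    """Flatten a nested JSON record into a single-level dict whose keys
    resemble warehouse column names (e.g. "rep_id", "deal_amount")."""
    flat = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Recurse into nested objects, prefixing child keys.
            flat.update(flatten_record(value, new_key, sep))
        else:
            flat[new_key] = value
    return flat

# Hypothetical commission payload as it might arrive from a customer API.
payload = json.loads("""
{"rep": {"id": "R-42", "name": "Ada"},
 "deal": {"amount": 12500, "currency": "USD"},
 "period": "2024-Q1"}
""")

row = flatten_record(payload)
# → {"rep_id": "R-42", "rep_name": "Ada", "deal_amount": 12500,
#    "deal_currency": "USD", "period": "2024-Q1"}
```

In practice a row like this would then be loaded into a Snowflake staging table, where CTEs and UDFs handle the heavier transformations.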

🎯 Requirements

• A strong background (4+ years of experience) in software and/or data engineering

• Proven experience working with data pipelines at scale and knowledge of data engineering technologies such as Snowflake

• Strong proficiency in Python and SQL

• Collaborative and humble, with a strong growth mindset and appetite for learning

• Prior experience in a customer-facing role

• Experience configuring and troubleshooting SSO (SAML, OIDC) and SCIM provisioning across Okta, Entra ID (Azure AD), and OneLogin environments

• Hands-on experience setting up modern data stacks (ETL, data warehouses, dbt, reverse ETL)

🏖️ Benefits

• Participate in an on-call rotation to provide after-hours support, ensuring timely resolution of critical issues and maintaining system uptime.


Similar Jobs

October 23

DWH Engineer analyzing and interpreting large datasets in media & advertising. Collaborating with a dynamic team and leveraging cutting-edge technologies for insights.

AWS

Cloud

SQL

October 22

Data Engineer designing and maintaining data pipelines for AI-powered negotiation systems at Monq. Collaborating with AI and product teams to enhance procurement operations.

Airflow

Apache

AWS

Azure

ERP

ETL

Kafka

Oracle

Python

Spark

October 19

Senior Data Engineer who embeds with teams on large projects in Microsoft Azure environments. 100% remote role focused on data engineering best practices.

🗣️🇧🇷🇵🇹 Portuguese Required

AWS

Azure

ETL

Google Cloud Platform

SQL

SSIS

October 15

DWH Engineer in data-driven media & advertising industry leveraging cutting-edge technologies for insights. Analyze data, develop processes, and design data warehouse models within a dynamic team.

AWS

Cloud

SQL

October 14

Senior Data Engineer focusing on Business Intelligence at Hitachi Energy's Łódź Technology Center. Designing data solutions, optimizing data processes, and implementing analytics tools.

Azure

ETL

Python

SQL

SSIS
