Data Engineer

Job not on LinkedIn

October 22


MONQ

B2B • SaaS • Artificial Intelligence

MONQ is an AI-driven negotiation platform that helps enterprise procurement teams optimize negotiation strategies and outcomes for buyers and vendors. Launching Q4 2025, the product provides real-time analytics, automated risk assessment and compliance monitoring, and performance insights designed to reduce costs and accelerate deal cycles for large contracts and global procurement operations.

📋 Description

• Design and operate performant, scalable ingestion pipelines processing high-volume contract data, vendor databases, and procurement system integrations from Fortune 500 enterprises (a minimal pipeline sketch follows this list)
• Create sophisticated data parsing systems for complex legal/procurement documents, multi-dimensional deal terms, and procurement workflow data
• Build real-time data feeds supporting autonomous negotiation agents and strategic decision-making dashboards
• Define, evolve, and manage data schemas for contract intelligence, vendor benchmarking, and negotiation performance analytics
• Build comprehensive contract data catalogs ensuring discoverability and lineage tracking for complex procurement datasets
• Design data models that support multi-dimensional optimization across price, terms, risk, timeline, and relationship factors
• Build end-to-end monitoring and observability for mission-critical negotiation data pipelines
• Collaborate closely with AI, Platform, and Product teams, provisioning datasets, feature tables, and contracts that power autonomous negotiation agents at scale
• Create data pipelines that integrate with major procurement systems (SAP, Oracle, Coupa, GEP) and enterprise workflows
• Continuously improve efficiency and reliability via comprehensive testing, CI/CD automation, and cost/performance tuning for enterprise-scale deployments
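To give a concrete flavor of the pipeline work above, here is a minimal sketch of a daily contract-ingestion DAG in Airflow (one of the orchestrators named in the requirements below). Every identifier in it (the dag_id, the extract/parse/load helpers, the schedule) is a hypothetical illustration, not MONQ's actual stack:

```python
# Hypothetical sketch only: a daily Airflow DAG that extracts newly landed
# contract documents, parses deal terms, and loads them into a warehouse.
# All names below are illustrative assumptions, not MONQ's real pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_contracts(**context):
    # Placeholder: list contract files that landed since the last run.
    ...


def parse_deal_terms(**context):
    # Placeholder: pull price, terms, risk, and timeline fields out of
    # the raw documents.
    ...


def load_to_warehouse(**context):
    # Placeholder: upsert the parsed records into an analytics schema.
    ...


with DAG(
    dag_id="contract_ingestion",      # illustrative name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                # `schedule` is the Airflow 2.4+ spelling
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_contracts)
    parse = PythonOperator(task_id="parse", python_callable=parse_deal_terms)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    extract >> parse >> load          # linear extract -> parse -> load ordering
```

In practice each stage would fan out per procurement source (SAP, Oracle, Coupa, GEP), with the monitoring and lineage tracking the description calls out layered on top.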

🎯 Requirements

• 3+ years building and running data pipelines in AWS or Azure, with experience in enterprise B2B environments and complex data integration challenges
• Expert-level programming in Python with a focus on production-grade ETL systems; experience with enterprise data at scale
• Deep experience with batch and streaming frameworks (Apache Spark, Flink, Kafka Streams, Beam), including orchestration via Airflow, Prefect, or Dagster (see the PySpark sketch after this list)
• Background in legal/procurement document processing, contract analysis, or complex unstructured data parsing (preferred)
• Proven track record of integrating with major enterprise systems (ERP, procurement platforms, business intelligence tools) (preferred)
• Experience building data infrastructure that supports machine learning models and AI agents in production (preferred)
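As a companion to the framework requirement, here is a minimal PySpark batch sketch that computes per-vendor benchmarks from raw contract records. The paths, column names, and aggregations are assumptions made up for illustration:

```python
# Hypothetical sketch only: a small PySpark batch job computing per-vendor
# benchmarks. Paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("vendor-benchmarks").getOrCreate()

# Read raw contract records (illustrative location and fields).
contracts = spark.read.json("data/contracts/*.json")

# Benchmark each vendor: average contract value and number of contracts.
benchmarks = (
    contracts
    .groupBy("vendor_id")
    .agg(
        F.avg("contract_value").alias("avg_contract_value"),
        F.count("*").alias("contract_count"),
    )
)

# Persist the benchmark table for downstream analytics.
benchmarks.write.mode("overwrite").parquet("data/benchmarks/")
spark.stop()
```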

🏖️ Benefits

• Significant equity stake - play a real part in MONQ's potential to capture a $4.2T market
• Bi-annual performance bonuses tied to successful pilot deployments and customer outcomes
• Remote-first with quarterly gatherings
• Annual team retreat - a fully funded off-site focused on AI innovation and team building
• Minimum 30 days of annual leave, plus a day off in the month of your birthday
• Flexible working hours, as long as the work gets done
• Temporary work from abroad for up to 120 days a year


Similar Jobs

October 19

Senior Data Engineer who embeds with teams on large projects in Microsoft Azure environments. 100% remote role focused on data engineering best practices.

🗣️🇧🇷🇵🇹 Portuguese Required

AWS • Azure • ETL • Google Cloud Platform • SQL • SSIS

October 15

DWH Engineer in the data-driven media & advertising industry, leveraging cutting-edge technologies for insights. Analyze data, develop processes, and design data warehouse models within a dynamic team.

AWS • Cloud • SQL

October 14

Senior Data Engineer focusing on Business Intelligence at Hitachi Energy's Łódź Technology Center. Designing data solutions, optimizing data processes, and implementing analytics tools.

Azure • ETL • Python • SQL • SSIS

October 10

Senior Data Engineer architecting and building data infrastructure at Rebrandly. Responsible for AWS-based data ecosystems, impacting millions of global users.

Airflow • Amazon Redshift • AWS • DynamoDB • ETL • NoSQL • Python • SQL • Terraform

October 10

Data Architect leading strategic design and implementation of global data model at InPost Group. Focused on building scalable data ecosystems and optimizing performance across various data initiatives.

🗣️🇵🇱 Polish Required

Azure • ETL • Python • Spark • SQL
