
B2B • eCommerce • Marketing
Influur is an innovative platform designed for brands and influencers to connect seamlessly. It enables brand managers, marketing agencies, and influencer managers to easily build and implement influencer marketing strategies using cutting-edge technology. With access to over 30,000 influencers globally and advanced data integration, Influur simplifies the entire process, from campaign launch to secure payments, all while providing transparent pricing.
51 - 200 employees
Founded 2020
🤝 B2B
🛍️ eCommerce
💰 $5M Seed Round in April 2022
November 17

• Influur is redefining how advertising works, through creators, data, and AI.
• Our mission is to make influencer marketing as measurable, predictable, and scalable as paid ads, and we're building the tech that powers it.
• Backed by top-tier investors and trusted by global brands, we're scaling fast across music and culture.
• We're looking for young talent ready to go all in, and we're offering significant equity to people who want to build something that matters. This isn't a job; it's an opportunity to define the future of AI in influencer marketing.
• Strong programming skills in Python and SQL.
• Comfortable building from scratch and improving existing code.
• Expertise in data modeling and warehousing, including dimensional modeling and performance tuning.
• Experience designing and operating ETL and ELT pipelines with tools like Airflow or Dagster, plus dbt for transformations (a minimal pipeline sketch follows this list).
• Hands-on with batch and streaming systems and with Lakehouse or warehouse tech on AWS or GCP.
• Proficiency integrating third-party APIs and datasets, ensuring reliability, lineage, and governance.
• Familiarity with AI data needs: feature stores, embedding pipelines, vector databases, and feedback loops that close the gap between model and outcome.
• High standards for code quality, testing, observability, and CI.
• Comfortable with Docker and modern cloud infrastructure.
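To make the pipeline requirement concrete, here is a minimal sketch of a daily ELT job using Airflow's TaskFlow API (Airflow 2.4+). The DAG name, the stubbed API extract, and the engagement-rate transform are illustrative assumptions, not Influur's actual stack; in a real setup the transform would typically live in dbt and the load step would write to the warehouse.

# Minimal daily ELT sketch (Airflow 2.4+ TaskFlow API).
# The DAG id, stubbed extract, and engagement-rate transform are illustrative only.
from __future__ import annotations

from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["elt"])
def influencer_metrics_elt():

    @task
    def extract() -> list[dict]:
        # Pull raw engagement records from a third-party API (stubbed here).
        return [{"creator_id": 1, "views": 1200, "likes": 90}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Derive an engagement-rate column, skipping rows with zero views.
        # In practice this kind of transformation would live in a dbt model.
        return [
            {**r, "engagement_rate": r["likes"] / r["views"]}
            for r in records
            if r["views"] > 0
        ]

    @task
    def load(rows: list[dict]) -> None:
        # Write to the warehouse; printed here to keep the sketch self-contained.
        for row in rows:
            print(row)

    load(transform(extract()))


influencer_metrics_elt()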
• Competitive equity in a venture-backed company.
• Opportunities to grow and develop.
• Remote work.
November 17
Data Engineer focusing on evolving data infrastructure for Technosylva’s data platform. Building and maintaining data pipelines; collaborating with Science teams and adopting DevOps culture.
🗣️🇪🇸 Spanish Required
November 14
Senior Data Engineer designing and optimizing scalable data pipelines for machine learning solutions at Cotiviti. Collaborating across teams to deliver high-quality data products.
🗣️🇪🇸 Spanish Required
November 7
Design and automate essential data pipelines ensuring seamless integration for Nielsen's analytics. Collaborate with teams and uphold data integrity throughout its lifecycle.
November 6
GCP Data Engineer in a remote role for a leading data solutions company. Responsible for implementing data architectures and ETL processes with a focus on Google Cloud Platform.
🗣️🇪🇸 Spanish Required
November 4
Data Engineer at Inetum designing and operating large-scale petabyte data systems. Building real-time data pipelines and collaborating in agile environments to deliver scalable solutions.
🗣️🇪🇸 Spanish Required