Data Platform Architect

Job not on LinkedIn

September 3

Apply Now

ClearOps

Agriculture • Automotive • Construction & Mining

ClearOps is an aftersales platform that connects manufacturers, dealers, and end customers to enhance collaboration and streamline services. The platform provides comprehensive visibility across the entire supply chain, enabling real-time management of parts, service operations, and customer interactions. ClearOps is designed to optimize dealer network performance through improved inventory management, demand planning, and automated workflows, ultimately reducing downtime and boosting productivity in industries such as agriculture and automotive.

51 - 200 employees

Founded 2020

🌾 Agriculture

💰 $70k Pre Seed Round on 2018-02

📋 Description

• Design and maintain the overall data architecture, ensuring scalability, security, and performance
• Define and implement data models and organize how data is stored, integrated, and accessed
• Select and implement technologies within AWS and the broader data ecosystem to build scalable solutions
• Build and orchestrate batch and real-time data pipelines, integrating multiple internal and external sources
• Ensure reliable data availability through effective ETL/ELT processes and backup strategies
• Establish governance frameworks and standards for data quality, metadata, lifecycle management, and compliance
• Standardize data formats and access controls across the company to ensure consistency and security
• Drive the creation of data products, analytics models, dashboards, and curated datasets
• Collaborate with engineers, analysts, and product teams to transform raw data into actionable insights
• Provide best practices and mentorship to enable teams to build on the platform effectively
• Work closely with stakeholders to align data architecture with business needs and technical requirements
• Continuously evaluate and adopt new technologies to keep the platform modern and efficient

🎯 Requirements

• Proven experience in designing and building large-scale data platforms
• Strong knowledge of data modeling, including dimensional modeling and star and snowflake schemas
• Expertise with modern data lakes, data warehouses, and lakehouse architectures using AWS services such as S3, Redshift, and Glue
• Hands-on experience designing and optimizing ETL/ELT pipelines with Python-based frameworks and AWS tools (PySpark on EMR, AWS Glue, AWS Data Wrangler, AWS Lambda, Step Functions)
• Experience with distributed data processing on AWS EMR and orchestration tools like Apache Airflow or dbt
• Familiarity with change-data-capture (CDC) on Kafka for near-real-time data pipelines
• Strong background with SQL and NoSQL systems such as PostgreSQL, MySQL, Redshift, and cloud storage data lakes
• Experience building or migrating data warehouse and lake architectures with Redshift, Databricks, Snowflake, or S3
• Knowledge of AWS security best practices, networking, and scalable cloud deployments
• Practical experience with infrastructure as code (Terraform, CloudFormation), containerization (Docker), and orchestration (Kubernetes on EKS)
• Ability to design serverless data pipeline components (e.g., Lambda)
• Strong communication skills to explain complex data concepts and align cross-functional teams
• Nice to have: Familiarity with supply chain data and processes such as logistics, inventory, and supplier data
• Nice to have: Domain expertise in the supply chain industry to maximize the impact of data architecture

🏖️ Benefits

• High Impact: Shape the core of our data platform and influence product direction
• Growth Opportunities: Develop into a strategic leadership role as the platform and company scale
• Supportive Culture: Join a collaborative team where your expertise drives real change
• Learning & Development: Access to mentors, continuous training, and professional growth
• Flexibility: Flexible working hours, mobile work, and workcation opportunities


Similar Jobs

September 3

Devoteam

5001 - 10000

🤖 Artificial Intelligence

🔒 Cybersecurity

Java Tech Lead at Devoteam, an EMEA digital transformation consultancy, building Java/Spring REST APIs and microservices, leading teams, and ensuring code quality.

🇵🇹 Portugal – Remote

⏰ Full Time

🟠 Senior

🔙 Backend Engineer

September 2

Ethena Labs

2 - 10

💸 Finance

💳 Fintech

🌐 Web 3

Senior backend engineer at Ethena Labs building and scaling backend architecture, APIs, and integrations for crypto-native dollar and DeFi infrastructure.

🇵🇹 Portugal – Remote

⏰ Full Time

🟠 Senior

🔙 Backend Engineer

August 29

PandaDoc

501 - 1000

☁️ SaaS

🤝 B2B

⚡ Productivity

Python Engineer at PandaDoc; building document generation features for thousands of users; remote across Europe.

🇵🇹 Portugal – Remote

💰 Series C on 2021-09

⏰ Full Time

🟡 Mid-level

🟠 Senior

🔙 Backend Engineer

August 27

Imaginary Cloud

51 - 200

☁️ SaaS

🤝 B2B

Senior Fullstack Developer building web and mobile products at Imaginary Cloud. Work remotely on global client projects using React, Ruby on Rails, Python and modern toolchains.

🇵🇹 Portugal – Remote

💵 €39.7k - €47.7k / year

⏰ Full Time

🟠 Senior

🔙 Backend Engineer

August 20

PandaDoc

501 - 1000

☁️ SaaS

🤝 B2B

⚡ Productivity

Senior Python Engineer for PandaDoc's Application Platform; develops scalable services, collaborates with product teams, and drives architecture decisions.

🇵🇹 Portugal – Remote

💰 Series C on 2021-09

⏰ Full Time

🟠 Senior

🔙 Backend Engineer

Developed by Lior Neu-ner. I'd love to hear your feedback — Get in touch via DM or support@remoterocketship.com