
Artificial Intelligence • Biotechnology • SaaS
TetraScience transforms raw scientific data into AI-native datasets for advanced scientific applications. By collaborating closely with leading biopharmaceutical companies, TetraScience enhances productivity, accelerates insights, and ensures data integrity across the scientific value chain. Its platform offers solutions for next-generation lab data management, AI-driven scientific outcomes, and compliance with industry standards. As the first company to provide a data and AI cloud built specifically for science, TetraScience enables its clients to liberate, unify, and transform their data, overcoming traditional data silos and boosting scientific productivity through a flexible, open, and collaborative infrastructure.
November 3

• You will be a senior member of the Scientific Data Engineering team, helping build Tetra Data and productizable solutions, the foundation of the Data Engineering layer.
• Work with Product Managers and Solution Architects to understand business requirements, gather insight into potential outcomes, recommend options, and build a solution based on consensus.
• Take ownership of building data models, prototypes, and integration solutions that drive customer success.
• Research and prototype data integration strategies for scientific lab instrumentation, including file parsers for instrument output files (.xlsx, .pdf, .txt, .raw, .fid, and many other vendor binaries); a minimal illustrative sketch appears after the benefits list below.
• Act as a quality gatekeeper: design with quality in mind, backed by unit tests, integration tests, and utility functions.
• Lead team-wide process and technology improvements in product quality and developer experience.
• Rally the team to finish Agile sprint commitments, actively surfacing team inefficiencies and working to resolve them.
• Be driven by results, with the pragmatic urgency to resolve blockers, clarify requirements, and make things happen.
• Provide mentorship to junior SDEs and show leadership on every front.
• 8+ years of experience building solutions as a Data Engineer or in a similar field.
• 8+ years working in Python and SQL with a focus on data.
• 6+ years of experience leading projects, managing requirements, and handling timelines.
• 4+ years of experience managing multiple customer-focused implementation projects across cross-functional teams, building sustainable processes, and managing delivery milestones.
• Excellent communication skills, attention to detail, and the confidence to take control of project delivery.
• Ability to quickly understand a highly technical product and communicate effectively with product management and engineering.
• 100% employer-paid benefits for all eligible employees and immediate family members.
• 401K.
• Unlimited paid time off (PTO).
• Flexible working arrangements.
• Company-paid life insurance and LTD/STD coverage.
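As a rough illustration of the file-parser responsibility mentioned above, the following is a minimal Python sketch of an extension-based parser dispatcher. The names (parse_instrument_file, parse_text, parse_xlsx, PARSERS) are hypothetical and not part of the Tetra Data platform; real instrument parsers would decode vendor-specific binaries and map their contents into the platform's data schemas.

from pathlib import Path
from typing import Any, Callable, Dict

# Hypothetical format-specific parsers; real ones would handle vendor
# binaries (.raw, .fid, ...) and produce normalized records.
def parse_text(path: Path) -> Dict[str, Any]:
    # Plain-text instrument output: keep raw lines for downstream mapping.
    return {"source": path.name, "format": "txt", "lines": path.read_text().splitlines()}

def parse_xlsx(path: Path) -> Dict[str, Any]:
    # Placeholder: a real parser might use openpyxl to extract worksheets.
    return {"source": path.name, "format": "xlsx", "parsed": False}

# Registry mapping file extensions to parser callables.
PARSERS: Dict[str, Callable[[Path], Dict[str, Any]]] = {
    ".txt": parse_text,
    ".xlsx": parse_xlsx,
}

def parse_instrument_file(path: Path) -> Dict[str, Any]:
    """Dispatch an instrument output file to a format-specific parser."""
    parser = PARSERS.get(path.suffix.lower())
    if parser is None:
        raise ValueError(f"No parser registered for {path.suffix!r}")
    return parser(path)

In practice, each parser would be backed by unit and integration tests against sample instrument output files, in line with the quality-gatekeeper responsibility above.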
November 1
Data Migration Engineer specializing in Snowflake and dbt for efficient data pipeline development. Collaborating with diverse teams to manage data workflows and integrations in AWS.
Airflow
AWS
ETL
Matillion
Python
SQL
November 1
Senior Data Engineer designing, developing, and implementing data solutions for clients. Collaborating with teams to analyze data requirements and ensure quality project delivery.
🗣️🇫🇷 French Required
Ansible
AWS
Azure
BigQuery
Cloud
ETL
Google Cloud Platform
Hadoop
Kafka
Python
Spark
SQL
Terraform
October 31
Senior Data Engineer at Shopmonkey managing and optimizing data infrastructure. Involved in building and improving data pipelines and tools for internal and external stakeholders.
Airflow
Cloud
Docker
Google Cloud Platform
Kubernetes
Python
SQL
October 31
Senior Data Engineer responsible for building and maintaining data infrastructure at Shopmonkey. Ensuring data flow efficiency and mentoring junior engineers in cloud and orchestration tools.
Airflow
Cloud
Docker
Google Cloud Platform
Kubernetes
Python
SQL
October 31
Azure Data Engineer developing, building, and maintaining data engineering solutions utilizing Microsoft Azure and Fabric services. Collaborating with cross-functional teams in an Agile environment to deliver high-quality solutions.
Apache
Azure
ERP
ETL
NoSQL
Oracle
PySpark
Python
Spark
SQL