
Enterprise • Science • Legal
RELX is a global provider of information-based analytics and decision tools for professional and business customers. The company focuses on enabling its clients to make better decisions, improve results, and enhance productivity by leveraging advanced technology and data. RELX serves various sectors, including Risk, Scientific, Technical & Medical, Legal, and Exhibitions, by offering specialized information and analytical tools that facilitate critical decision-making. The company is committed to corporate responsibility and delivering societal benefit through its products by contributing to scientific advancement, legal justice, and effective market transactions.
November 1
🦌 Connecticut – Remote
🦞 Maine – Remote
+1 more state
💵 $70.2k - $117.1k / year
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer

• Designing, implementing, and maintaining data pipelines using dbt and Snowflake
• Developing and automating Python scripts for data transformation, validation, and delivery
• Managing data workflows and deployments across the AWS ecosystem (S3, Lambda, ECS, IAM, etc.)
• Collaborating with internal and external teams to deliver efficient, secure data integrations
• Troubleshooting and resolving data pipeline or performance issues
• Applying best practices for CI/CD, testing, and version control in data workflows
• Contributing to ETL orchestration and scheduling using Matillion
• Current experience with dbt and Snowflake (required). Please do not apply without this experience.
• Experience with Matillion ETL or similar data orchestration tools
• Familiarity with Airflow, Dagster, or other workflow orchestration frameworks
• Current and extensive Python development skills for automation and data processing
• Solid understanding of AWS services related to data engineering
• Experience with SQL, schema design, and performance optimization
• Familiarity with Git and collaborative development practices
• Health Benefits: Comprehensive, multi-carrier program for medical, dental, and vision benefits
• Retirement Benefits: 401(k) with match and an Employee Share Purchase Plan
• Wellbeing: Wellness platform with incentives, Headspace app subscription, Employee Assistance and Time-off Programs
• Short- and Long-Term Disability, Life and Accidental Death Insurance, Critical Illness, and Hospital Indemnity
• Family Benefits, including bonding and family care leaves, adoption and surrogacy benefits
• Health Savings, Health Care, Dependent Care, and Commuter Spending Accounts
• In addition to annual Paid Time Off, we offer up to two days of paid leave each to participate in Employee Resource Groups and to volunteer with your charity of choice
November 1
Senior Data Engineer designing, developing, and implementing data solutions for clients. Collaborating with teams to analyze data requirements and ensure quality project delivery.
🗣️🇫🇷 French Required
Ansible
AWS
Azure
BigQuery
Cloud
ETL
Google Cloud Platform
Hadoop
Kafka
Python
Spark
SQL
Terraform
October 31
Senior Data Engineer responsible for building and maintaining data infrastructure at Shopmonkey. Ensuring data flow efficiency and mentoring junior engineers in cloud and orchestration tools.
Airflow
Cloud
Docker
Google Cloud Platform
Kubernetes
Python
SQL
October 31
Senior Data Engineer at Shopmonkey managing and optimizing data infrastructure. Involved in building and improving data pipelines and tools for internal and external stakeholders.
Airflow
Cloud
Docker
Google Cloud Platform
Kubernetes
Python
SQL
October 31
Azure Data Engineer developing, building, and maintaining data engineering solutions utilizing Microsoft Azure and Fabric services. Collaborating with cross-functional teams in an Agile environment to deliver high-quality solutions.
Apache
Azure
ERP
ETL
NoSQL
Oracle
PySpark
Python
Spark
SQL
October 31
Data Engineer developing solutions for long-term federal government client programs. Collaborating in agile teams to design and develop impactful technology products.
Amazon Redshift
ETL
MS SQL Server
SQL