
Media • Data Analytics • Marketing
Nielsen is a global measurement and data analytics company that provides audience measurement and media metrics solutions across various platforms. They offer cross-media measurement services spanning TV, digital, streaming, and audio for more precise media planning and marketing optimization. Nielsen's flagship product, Nielsen ONE, integrates these services to deliver a comprehensive view of audience habits and preferences. Additionally, they offer insights and tools for understanding consumer behavior and optimizing marketing strategies, backed by a robust data infrastructure including big data and panel methodologies. Nielsen also champions diversity and inclusion, emphasizing a culture supported by a diverse workforce and numerous business resource groups.
November 7

• Design and automate essential data pipelines and inputs, ensuring seamless integration with downstream analytics and production systems.
• Collaborate with cross-functional teams to integrate new functionality into existing data pipelines, including lower test environments that help validate and assess impact prior to production integration, where applicable.
• Implement data governance and quality processes to ensure the integrity and accuracy of data throughout its lifecycle.
• Monitor data systems and processes to identify issues and proactively implement improvements that prevent future problems.
• Participate in code reviews with senior developers before pushing code to production, ensuring it meets accuracy and best-practice standards.
• Perform unit testing using test cases and fix any bugs.
• Optimize code to meet product SLAs.
• Support multiple projects and communicate with stakeholders across various organizations, including regularly providing status updates, developing timelines, and sharing insights.
• Bachelor’s degree in Computer Science, Data Science, Analytics, or a related field
• Excellent written and verbal communication skills in English
• 3–5 years of experience with the following:
• Coding in Python, PySpark, and SQL
• Working with cloud-based infrastructure and tools such as AWS, EC2, GitLab, and Airflow
• Working within the Software Development Life Cycle framework and applying software development best practices
• Building monitoring checks and tools to ensure infrastructure and related processes work as expected
• Solid understanding of system design, data structures, and performance optimization techniques
• Excellent problem-solving skills and attention to detail
• Well organized and able to handle and prioritize multiple assignments
• Able to communicate effectively both orally and in writing
• (Preferred) 2+ years of experience with visualization and reporting tools, e.g. Tableau
• (Preferred) Experience working with Jira, Confluence, and Smartsheet
• (Preferred) Experience with the Alteryx and Databricks platforms
• Employees can work remotely
November 6
GCP Data Engineer in a remote role for a leading data solutions company. Responsible for implementing data architectures and ETL processes with a focus on Google Cloud Platform.
🗣️🇪🇸 Spanish Required
Apache
BigQuery
Cloud
ETL
Google Cloud Platform
PySpark
Python
Spark
SQL
November 4
Data Engineer developing and maintaining data pipelines for a global agile consultancy. Utilizing the Modern Data Stack, with expertise in Snowflake and Azure Data Factory.
🇲🇽 Mexico – Remote
💵 $50k - $65k / month
💰 Post-IPO Equity on 2007-03
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
🗣️🇪🇸 Spanish Required
Azure
ETL
ITSM
Python
SDLC
ServiceNow
SQL
November 4
Data Engineer at Inetum designing and operating petabyte-scale data systems. Building real-time data pipelines and collaborating in agile environments to deliver scalable solutions.
🗣️🇪🇸 Spanish Required
Airflow
AWS
Azure
Cassandra
Google Cloud Platform
Hadoop
HBase
Java
Kafka
Oracle
Python
Spark
SQL
November 1
Data Engineer at Inetum focusing on operating large-scale data systems and building real-time data pipelines. Collaborating with engineers and product managers to deliver robust technical solutions.
🗣️🇪🇸 Spanish Required
Airflow
Azure
Cassandra
Cloud
Distributed Systems
Google Cloud Platform
Hadoop
HBase
Java
Kafka
Oracle
Python
Spark
SQL
October 31
Data Engineer role supporting a leading US insurance provider in optimizing data architecture and reporting analysis. Seeking candidates with strong technical skills in ETL and data visualization.
ETL
ITSM
Python
SQL