
B2C • Hospitality • Travel
Minor Hotels Europe and Americas is a hospitality company that operates a diverse portfolio of hotels and resorts across Europe and the Americas. The company focuses on delivering exceptional guest experiences through high-quality service and unique accommodations, catering to both leisure and business travelers.
10,000+ employees
Founded 1978
👥 B2C
💰 Post-IPO Equity on 2018-05
August 12
Airflow
Amazon Redshift
Apache
AWS
Azure
BigQuery
Cloud
ETL
Google Cloud Platform
Hadoop
IoT
Kafka
Python
Spark
SQL

• Designing, developing, and maintaining robust ETL/ELT pipelines for data ingestion and transformation.
• Optimizing data storage and retrieval for performance and scalability.
• Ensuring data quality, integrity, and security across systems.
• Collaborating with data scientists, analysts, and IT teams to support data needs.
• Implementing and managing data architecture and infrastructure on cloud or on-prem platforms.
• Monitoring and troubleshooting data pipeline issues and system performance.
• Maintaining documentation of data flows, schemas, and system configurations.
• Supporting real-time data processing and integration with IoT and manufacturing systems.
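As a flavor of the ETL/ELT work described above, here is a minimal sketch in plain Python using the standard-library sqlite3 module: extract raw rows, apply a basic data-quality rule during transform, then load into a target table. All table, column, and hotel names are hypothetical, for illustration only.

```python
import sqlite3

# Hypothetical source rows, e.g. booking events from an upstream system.
raw_bookings = [
    {"hotel": "hotel-berlin", "nights": 2, "rate_eur": "129.50"},
    {"hotel": "hotel-madrid", "nights": 0, "rate_eur": "99.00"},  # fails quality check
    {"hotel": "hotel-lisbon", "nights": 3, "rate_eur": "110.00"},
]

def transform(rows):
    """Cast types and drop rows that fail a basic quality check."""
    cleaned = []
    for row in rows:
        nights = int(row["nights"])
        if nights <= 0:  # data-quality rule: a stay must have at least one night
            continue
        cleaned.append((row["hotel"], nights, float(row["rate_eur"])))
    return cleaned

def load(rows, conn):
    """Create the target table if needed and load the cleaned rows."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS bookings (hotel TEXT, nights INTEGER, rate_eur REAL)"
    )
    conn.executemany("INSERT INTO bookings VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw_bookings), conn)
total = conn.execute("SELECT COUNT(*) FROM bookings").fetchone()[0]
print(total)  # 2 rows survive the quality check
```

In production, the same extract/transform/load steps would typically be wrapped as tasks in an orchestrator such as Airflow, with the quality rules monitored rather than silently dropping rows.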
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• 2–5 years of experience in data engineering or software development, preferably in manufacturing.
• Proficiency in SQL, Python, and data pipeline tools (e.g., Apache Airflow, Talend).
• Experience with cloud data platforms (e.g., Azure Data Factory, AWS Glue, GCP Dataflow).
• Knowledge of data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
• Familiarity with big data technologies (e.g., Hadoop, Spark) and real-time streaming (e.g., Kafka).
• Strong understanding of data modeling, database design, and system integration.
• Excellent collaboration and communication skills.
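To illustrate the data modeling and database design skills listed above, here is a minimal star-schema sketch using stdlib sqlite3: one fact table referencing one dimension table, plus the kind of aggregate query a warehouse would serve. Table, column, and hotel names are hypothetical.

```python
import sqlite3

# Minimal star schema: fact_stay references dim_hotel by surrogate key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_hotel (
        hotel_id INTEGER PRIMARY KEY,
        name     TEXT,
        country  TEXT
    );
    CREATE TABLE fact_stay (
        stay_id  INTEGER PRIMARY KEY,
        hotel_id INTEGER REFERENCES dim_hotel(hotel_id),
        nights   INTEGER,
        revenue  REAL
    );
""")
conn.executemany("INSERT INTO dim_hotel VALUES (?, ?, ?)", [
    (1, "Hotel Berlin Mitte", "DE"),
    (2, "Hotel Madrid Centro", "ES"),
])
conn.executemany("INSERT INTO fact_stay VALUES (?, ?, ?, ?)", [
    (10, 1, 2, 259.00),
    (11, 1, 1, 129.50),
    (12, 2, 3, 297.00),
])
# Typical analytical query: revenue per country via the dimension join.
rows = conn.execute("""
    SELECT d.country, SUM(f.revenue)
    FROM fact_stay f JOIN dim_hotel d ON f.hotel_id = d.hotel_id
    GROUP BY d.country ORDER BY d.country
""").fetchall()
print(rows)  # [('DE', 388.5), ('ES', 297.0)]
```

The same fact/dimension split carries over directly to the warehouse platforms named in the listing (Snowflake, Redshift, BigQuery); only the DDL dialect changes.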
• Flexible Work Arrangements — Work remotely or adjust work hours for a balanced lifestyle.
• Career Growth Opportunities — Access diverse roles and career development programs.
• Training & Certifications — Equip yourself with valuable certifications in emerging technologies.
March 14
Join our dynamic team as an AWS Data Engineer, designing and maintaining data infrastructure.