Realize the full value of the cloud.
IT as a Service • Multi-Cloud • Managed Hosting • Managed AWS/Azure/Google Cloud Platform/OpenStack/Alibaba • Managed Private Cloud for VMware/Microsoft/OpenStack
5001 - 10000
• We are seeking a highly skilled and experienced Senior Big Data Engineer to join our dynamic team.
• The ideal candidate has a strong background in developing batch processing systems, with extensive experience in Oozie, Airflow, and the Apache Hadoop ecosystem, and a solid understanding of public cloud technologies, especially GCP.
• The role is fully remote, requiring excellent communication skills and the ability to solve complex problems independently and creatively.
• Experience with GCP managed services and an understanding of cloud-based batch processing systems are critical.
• Proficiency in Oozie, Airflow, MapReduce, and Java.
• Strong programming skills in Java (specifically Spark), Python, Pig, and SQL.
• Expertise in public cloud services, particularly GCP.
• Proficiency in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce.
• Familiarity with Bigtable and Redis.
• Experience applying infrastructure and DevOps principles in daily work: using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC), such as Terraform, to automate and improve development and release processes.
• Ability to tackle complex challenges and devise effective solutions, using critical thinking to approach problems from various angles and propose innovative solutions.
• Proven ability to work effectively in a remote setting, with strong written and verbal communication skills; collaborates with team members and stakeholders to ensure a clear understanding of technical requirements and project goals.
• Proven experience engineering batch processing systems at scale.
• Hands-on experience with public cloud platforms, particularly GCP; additional experience with other cloud technologies is advantageous.
• Develop scalable and robust code for batch processing systems, working with technologies such as Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase.
• Develop, manage, and optimize data workflows using Oozie and Airflow within the Apache Hadoop ecosystem.
• Leverage GCP for scalable big data processing and storage solutions.
• Implement automation and DevOps best practices for CI/CD, IaC, etc.
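As a rough illustration of the batch-processing pattern this role centers on, here is a toy MapReduce-style word count in plain Python. This is purely a sketch of the map/shuffle/reduce phases (standard library only; no Hadoop or Spark is involved), not a representation of the team's actual codebase:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def shuffle_phase(pairs):
    # Shuffle: group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

# Hypothetical input standing in for a distributed dataset.
lines = ["big data batch processing", "batch data pipelines"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
print(counts)  # → {'big': 1, 'data': 2, 'batch': 2, 'processing': 1, 'pipelines': 1}
```

In a real Hadoop or Spark job the shuffle is handled by the framework across machines, but the three-phase structure is the same.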
🇨🇴 Colombia – Remote
💰 $80M Private Equity Round on 2018-09
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
April 1, 2023