
SaaS • Cloud
Rackspace Technology is a leading provider of managed cloud services, offering a comprehensive range of cloud and application solutions. The company specializes in helping businesses increase efficiency and reliability through cloud adoption, application modernization, and data solutions, leveraging technologies such as AI, machine learning, and next-gen data platforms. They offer a full suite of services including consulting, application modernization, cloud security, and multicloud strategies, tailoring solutions to meet the specific needs of sectors such as healthcare, financial services, and public utilities. Rackspace Technology is known for its expertise in navigating complex cloud environments and providing advanced managed services to optimize performance and ensure compliance.
November 19
Airflow
Apache
Cloud
Distributed Systems
Google Cloud Platform
Hadoop
HBase
Java
MapReduce
Python
Redis
Spark
SQL
Terraform

• Design and develop scalable batch processing systems using technologies like Hadoop, Oozie, Pig, Hive, MapReduce, and HBase, with hands-on coding in Java or Python (Java is a must).
• Lead Jira Epics.
• Write clean, efficient, production-ready code with a strong focus on data structures and algorithmic problem-solving applied to real-world data engineering tasks.
• Develop, manage, and optimize complex data workflows within the Apache Hadoop ecosystem, with a strong focus on Oozie orchestration and job scheduling.
• Leverage Google Cloud Platform (GCP) tools such as Dataproc, GCS, and Composer to build scalable, cloud-native big data solutions.
• Implement DevOps and automation best practices, including CI/CD pipelines, infrastructure as code (IaC), and performance tuning across distributed systems.
• Collaborate with cross-functional teams to ensure data pipeline reliability, code quality, and operational excellence in a remote-first environment.
• Bachelor's degree in Computer Science, Software Engineering, or a related field of study.
• Experience with managed cloud services and an understanding of cloud-based batch processing systems (critical).
• Ability to lead Jira Epics (required).
• Proficiency in Oozie, Airflow, MapReduce, and Java (required).
• Strong programming skills in Java (specifically Spark), Python, Pig, and SQL.
• Expertise in public cloud services, particularly GCP.
• Proficiency in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce.
• Familiarity with BigTable and Redis.
• Experience applying infrastructure and DevOps principles in daily work: using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC) such as Terraform to automate and improve development and release processes.
• Proven experience engineering batch processing systems at scale.
• 5+ years of experience in customer-facing software/technology or consulting.
• 5+ years of experience with on-premises-to-cloud migrations or IT transformations.
• 5+ years of experience building and operating solutions on GCP.
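The requirements above lean heavily on MapReduce-style batch processing. As a rough, framework-free sketch of that pattern (in Python, since the role accepts Java or Python), a word count might look like the following — the function names are illustrative, not Hadoop or Spark APIs:

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Emit a (word, 1) pair for every word in one input record.
    return [(word, 1) for word in record.split()]

def shuffle(pairs):
    # Group values by key, as the framework's shuffle/sort step would.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Sum the counts emitted for one key.
    return key, sum(values)

def word_count(records):
    # Wire map -> shuffle -> reduce over an iterable of text records.
    pairs = chain.from_iterable(map_phase(r) for r in records)
    return dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())

print(word_count(["big data", "big batch data"]))
# {'big': 2, 'data': 2, 'batch': 1}
```

In a real Hadoop job the shuffle is handled by the framework and the map/reduce functions run distributed across the cluster; this sketch only shows the shape of the computation interviewers for a role like this tend to probe.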
• The role may include variable compensation in the form of bonus, commissions, or other discretionary payments. These discretionary payments are based on company and/or individual performance and may change at any time.
November 19
Data Engineer at Mixpanel responsible for data and software pipelines. Collaborating with Data Science, Product, and Finance teams to ensure high-quality data sources.
🇺🇸 United States – Remote
💵 $174.6k - $213.4k / year
⏰ Full Time
🟠 Senior
🚰 Data Engineer
🦅 H1B Visa Sponsor
Airflow
BigQuery
Java
Python
SQL
November 18
Senior Data Engineer I managing PostgreSQL databases and data integrations at OppFi. Ensuring data quality with a focus on performance and collaboration with teams.
🇺🇸 United States – Remote
💵 $123.2k - $184.8k / year
💰 $250M Post-IPO Debt on 2023-07
⏰ Full Time
🟠 Senior
🚰 Data Engineer
🦅 H1B Visa Sponsor
Airflow
Apache
Postgres
Python
SQL
November 18
Senior Data Engineer shaping data-driven operations at Agility Robotics. Collaborate with teams to design and maintain datasets for analysis and debugging.
Airflow
Apache
AWS
Cloud
ETL
Java
Python
Scala
Spark
November 18
Senior Data Engineer developing data infrastructure for Vanna Health's community-based care for mental illness. Collaborating with teams to optimize clinical and financial outcomes.
BigQuery
Cloud
MySQL
Postgres
Python
Spark
SQL
November 18
AI Data Engineer responsible for architecting and operating real-time ETL pipelines for AI/ML applications in autonomous multi-agent systems. Contribute to team processes, documentation, and operational quality.
AWS
Cloud
ETL
Kafka
Python