
B2B • Recruitment • SaaS
RemoteStar is a global recruitment service that specializes in hiring top-quality tech talent. By assembling diverse teams with vetted developers from various regions, RemoteStar ensures high-quality staffing while maximizing cost efficiency for companies. The service includes a rigorous vetting process, technical matching, and full onboarding support, allowing businesses to focus on their core operations while RemoteStar handles the administrative aspects of recruitment and team management.
11 - 50 employees
Founded 2020
🤝 B2B
🎯 Recruiter
☁️ SaaS
July 17

Responsibilities:
• Design and build scalable data pipelines for extraction, transformation, and loading (ETL) using the latest Big Data technologies.
• Identify and implement internal process improvements, such as automating manual tasks and optimizing data flows for better performance and scalability.
• Partner with Product, Data, and Engineering teams to address data-related technical issues and infrastructure needs.
• Collaborate with machine learning and analytics experts to support advanced data use cases.
Requirements:
• Bachelor’s degree in Engineering, Computer Science, or a relevant technical field.
• 10+ years of recent experience in Data Engineering roles.
• Minimum 5 years of hands-on experience with Apache Spark, with a strong understanding of Spark internals.
• Deep knowledge of Big Data concepts and distributed systems.
• Proficiency in coding with Scala, Python, or Java, with flexibility to switch languages when required.
• Expertise in SQL, and hands-on experience with PostgreSQL, MySQL, or similar relational databases.
• Strong cloud experience with Databricks, including Delta Lake.
• Experience working with data formats such as Delta Tables, Parquet, CSV, and JSON.
• Comfortable working in Linux environments and scripting.
• Comfortable working in an Agile environment.
• Machine Learning knowledge is a plus.
• Must be capable of working independently and delivering stable, efficient, and reliable software.
• Experience supporting and working with cross-functional teams in a dynamic environment.
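The core ETL duty described above can be sketched in miniature. This is an illustrative example only, using Python's standard library (csv and sqlite3) as a lightweight stand-in for the Spark/Databricks stack the role actually uses; the table and field names are hypothetical.

```python
# Minimal ETL sketch: extract rows from CSV, transform fields, load into SQLite.
# Illustrative only -- the role itself uses Spark/Databricks; the "hires" table
# and its columns are hypothetical.
import csv
import io
import sqlite3


def run_etl(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV text, normalize fields, load them into a table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS hires (name TEXT, role TEXT, region TEXT)"
    )
    rows = []
    for rec in csv.DictReader(io.StringIO(csv_text)):
        # Transform step: trim whitespace and normalize casing.
        rows.append((
            rec["name"].strip(),
            rec["role"].strip().lower(),
            rec["region"].strip().upper(),
        ))
    conn.executemany("INSERT INTO hires VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)


if __name__ == "__main__":
    sample = "name,role,region\nAda , Data Engineer , eu\nLin, ML Engineer ,apac\n"
    conn = sqlite3.connect(":memory:")
    print(run_etl(sample, conn))  # number of rows loaded
```

In a Spark setting the same extract/transform/load shape would appear as a `spark.read` call, DataFrame transformations, and a write to a Delta table; the point here is only the pipeline structure.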
July 15
The Data Engineer manages Databricks and various AWS tools; this is a remote role with operational support duties.
July 15
Join V4C.ai as a Data Architect, specializing in Databricks for scalable analytics and ML.
July 9
Data Engineer specializing in Fraud Detection and Financial Crime Analytics for financial services. Responsible for designing real-time data pipelines and optimizing fraud detection models.
July 4
Join e.l.f. Beauty as a Senior Data Architect to lead data architecture initiatives in a high-growth team.
July 4
Join e.l.f. Beauty as a Data Engineer to design and maintain data pipelines and infrastructure.
🇮🇳 India – Remote
💵 ₹3.5M - ₹4.5M / year
💰 $225.2M Post-IPO Secondary on 2017-03
⏰ Full Time
🟠 Senior
🔴 Lead
🚰 Data Engineer