
Artificial Intelligence • SaaS • Enterprise
Proximity Works is a global tech company specializing in Artificial Intelligence and SaaS solutions. They focus on building high-performance software that leverages cutting-edge AI technologies to enhance productivity and transform business operations. Their services include AI-powered SaaS solutions, UX design, and product engineering to help companies scale effectively. Proximity Works collaborates with industry giants like Nasdaq, Disney, and ESPN, offering expertise in AI strategy, machine learning, and high-caliber product development.
51 - 200 employees
🤖 Artificial Intelligence
☁️ SaaS
🏢 Enterprise
October 27

Responsibilities:
- Design, develop, and maintain large-scale data pipelines to support Ads reporting, attribution, and analytics use cases.
- Work extensively with Hive, Spark, SQL, Scala, and Kafka to process and manage petabyte-scale datasets (see the streaming sketch after this list).
- Optimize data workflows for performance, scalability, and cost efficiency.
- Partner with data scientists, product managers, and platform engineers to deliver high-quality, reliable datasets and APIs.
- Ensure data quality, integrity, and consistency across multiple data sources.
- Troubleshoot and resolve issues in real-time streaming pipelines and batch data jobs.
- Continuously evaluate new technologies to enhance the Ads Data platform.
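To make the streaming side concrete, here is a minimal Scala sketch of a Kafka-to-Spark Structured Streaming aggregation of the kind this role describes. The broker address, topic name, event schema, and output paths are hypothetical placeholders, and the spark-sql-kafka connector is assumed to be on the classpath; this is an illustration, not the actual pipeline.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object AdEventsStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ad-events-stream")
      .getOrCreate()

    // Hypothetical schema for a raw ad event (impression or click).
    val eventSchema = new StructType()
      .add("event_id", StringType)
      .add("campaign_id", StringType)
      .add("event_type", StringType)
      .add("event_ts", TimestampType)

    // Read raw events from Kafka; brokers and topic are placeholders.
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker-1:9092")
      .option("subscribe", "ad-events")
      .option("startingOffsets", "latest")
      .load()

    // Parse the JSON payload, then count events per campaign and type
    // in 5-minute windows, allowing events to arrive up to 10 minutes late.
    val counts = raw
      .select(from_json(col("value").cast("string"), eventSchema).as("e"))
      .select("e.*")
      .withWatermark("event_ts", "10 minutes")
      .groupBy(window(col("event_ts"), "5 minutes"), col("campaign_id"), col("event_type"))
      .count()

    // Land the aggregates as Parquet; a production pipeline might target
    // a Hive or Delta table instead. Paths are placeholders.
    counts.writeStream
      .outputMode("append")
      .format("parquet")
      .option("path", "/data/ads/campaign_counts")
      .option("checkpointLocation", "/checkpoints/ad-events")
      .start()
      .awaitTermination()
  }
}
```

The watermark bounds how long late events are held before a window is finalized, which is the usual completeness-versus-latency trade-off in ads reporting.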
Requirements:
- Strong programming experience in Scala (preferred), Java, or Python.
- Hands-on experience with Apache Spark (batch & streaming) for large-scale data processing (a batch example follows below).
- Proficiency in Hive, SQL, and data modeling for analytical workloads.
- Experience working with Kafka for real-time event streaming.
- Solid understanding of Big Data ecosystems (S4, Hive, Presto, Delta, etc.).
- Strong debugging, performance tuning, and problem-solving skills.
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

Nice to Have:
- Experience in AdTech, Attribution, or Campaign Analytics.
- Familiarity with cloud-based big data solutions (AWS EMR, GCP BigQuery, Databricks, etc.).
- Familiarity with scheduling services like Airflow.
- Knowledge of data governance, security, and compliance best practices.
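As an illustration of the batch and Hive/SQL side, the sketch below computes a simple last-click attribution report in Scala. The table names (ads.clicks, ads.conversions, ads.daily_campaign_attribution), column names, run date, and the attribution rule are assumptions made for the example, not the actual data model.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

object DailyAttributionReport {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-attribution-report")
      .enableHiveSupport()
      .getOrCreate()

    val dt = "2024-01-01" // placeholder run date; normally injected by the scheduler

    // Hypothetical Hive tables, both partitioned by dt.
    val clicks = spark.table("ads.clicks").where(col("dt") === dt)
    val conversions = spark.table("ads.conversions").where(col("dt") === dt)

    // Pair each conversion with every earlier click by the same user...
    val joined = conversions.as("c").join(
      clicks.as("k"),
      col("c.user_id") === col("k.user_id") && col("k.click_ts") <= col("c.conversion_ts"))

    // ...then keep only the most recent click per conversion (last-click attribution)
    // and count attributed conversions per campaign.
    val lastClick = Window.partitionBy(col("c.conversion_id")).orderBy(col("k.click_ts").desc)

    val report = joined
      .withColumn("rn", row_number().over(lastClick))
      .where(col("rn") === 1)
      .groupBy(col("k.campaign_id"))
      .agg(count(lit(1)).as("attributed_conversions"))
      .withColumn("dt", lit(dt))

    // Assumes the target Hive table already exists with matching columns.
    report.write.mode("overwrite").insertInto("ads.daily_campaign_attribution")
  }
}
```

At petabyte scale the user-level join would typically be bucketed or pre-aggregated rather than run naively, which is exactly the kind of performance tuning the role calls for.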
Benefits:
- **Best in class salary**: We hire only the best, and we pay accordingly.
- **Proximity Talks**: Meet other designers, engineers, and product geeks, and learn from experts in the field.
- **Keep on learning with a world-class team**: Work with the best in the field, challenge yourself constantly, and learn something new every day.
October 27
Senior Software Engineer developing HighLevel's Desktop App within a remote-first global team. Engaging in both frontend and backend development to enhance user experience across platforms.
October 27
Senior Software Engineer developing SaaS-based web applications for Granicus, collaborating on innovative solutions and mentoring team members. Expertise in .NET, React, and Azure services required.
October 27
Software Engineer responsible for developing SaaS solutions and collaborating across teams at Granicus. Writing production code and mentoring others to enhance technology for government services.
October 27
Software Engineer developing .NET applications for Granicus. Collaborating with teams to deliver high-quality software in a supportive remote environment.
October 25
Software Engineer at Altisource developing and maintaining automated test scripts using Selenium with Java. Involves collaborating with development teams and integrating tests into CI/CD pipelines.