
B2B • SaaS • Recruitment
Smart Working is a recruitment service specializing in sourcing top-tier software developers from around the world to meet business needs. With a robust vetting process that includes technical assessments and background checks, Smart Working ensures that clients receive highly skilled developers adept in a wide range of programming languages and frameworks. The company focuses on flexible, remote hiring solutions that let businesses scale their development teams efficiently while realizing significant cost savings.
• Run hypothesis-led analysis over large datasets to uncover trends and drivers of outcomes, and craft client-ready narratives.
• Build and maintain the industry benchmark datasets that power reports/dashboards; keep definitions and versioning tight.
• Deliver clear, actionable Power BI reports for clients and internal stakeholders; maintain selected stand-alone reports in third-party tools where required.
• Own performance dashboards, operational processes, model version control/registry, and experiment tracking.
• Monitor drift and bias, plan/validate improvements, and manage safe deploy/rollback (see the sketch after this list).
• Keep the feature store fresh from published calls; ensure training data lineage and reproducibility.
• Partner on productised evaluations (automated tests, acceptance thresholds) and bias mitigation aligned to policy.
• Specify, design, and implement dashboards & reports within the KAI solution; integrate with portal and API surfaces.
• Collaborate with platform/DB teams on robust data integration & storage patterns across PostgreSQL/NoSQL and data lake assets.
• Support Copilot auto-report creation where suitable, ensuring source-of-truth metrics and governance.
• Run client-specific studies to test hypotheses and meet project goals; present findings to non-technical audiences.
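A note on the drift-monitoring duty above: the posting doesn't prescribe a method, but a common lightweight approach is the population stability index (PSI) computed per model feature. Below is a minimal sketch in Python, the role's primary stack; the function name, bin count, and alert thresholds are illustrative assumptions, not from the posting.

```python
import numpy as np
import pandas as pd

def population_stability_index(expected: pd.Series, actual: pd.Series, bins: int = 10) -> float:
    """PSI between a baseline (training) distribution and live data.

    Common rule of thumb (an assumption, tune per policy):
    < 0.1 stable, 0.1-0.25 moderate drift, > 0.25 significant drift.
    """
    expected, actual = expected.dropna(), actual.dropna()
    # Bin edges come from the baseline so both series share one grid;
    # live values outside the baseline range are excluded by np.histogram.
    edges = np.histogram_bin_edges(expected, bins=bins)
    expected_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    actual_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Floor the proportions to avoid log(0) on empty bins.
    expected_pct = np.clip(expected_pct, 1e-6, None)
    actual_pct = np.clip(actual_pct, 1e-6, None)
    return float(np.sum((actual_pct - expected_pct) * np.log(actual_pct / expected_pct)))

# Hypothetical check: last week's scored feature vs. the training baseline.
baseline = pd.Series(np.random.normal(0.0, 1.0, 10_000))
live = pd.Series(np.random.normal(0.3, 1.2, 2_000))
psi = population_stability_index(baseline, live)
print(f"PSI = {psi:.3f}:", "drift alert" if psi > 0.25 else "stable")
```

In practice a check like this would run on a schedule and feed the deploy/rollback decision mentioned above.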
• 5+ years — Python (pandas, NumPy, scikit-learn).
• 4+ years — Databases & LLM familiarity: PostgreSQL and DynamoDB (or equivalent); familiarity with LLM concepts and evaluation methods.
• 4+ years — Power BI & Excel stack: Power BI (data models, DAX), Power Query (M), and advanced Excel (pivots, complex formulas); other BI suites acceptable only with strong Excel experience.
• 4+ years — Data warehousing & ETL.
• 3+ years — Cloud data services & ML Ops tooling: exposure to AWS/Azure and ML Ops tools (feature store, experiment tracker, model registry, monitoring); GCP experience may be considered as a substitute. A minimal tracking sketch follows this list.
• Working style — Self-sufficient; ideally has led self-contained aspects of a project.
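On the ML Ops requirement: the listing names tool categories (experiment tracker, model registry, monitoring) rather than products. As one assumed example, experiment tracking with MLflow plus the required scikit-learn might look like the sketch below; the experiment name and parameters are hypothetical.

```python
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical experiment name; MLflow is an assumed tool choice here.
mlflow.set_experiment("benchmark-model-dev")

X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run():
    C = 0.5  # regularization strength under test
    model = LogisticRegression(C=C, max_iter=1_000).fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    # Logged params/metrics appear in the tracking UI for run-to-run comparison.
    mlflow.log_param("C", C)
    mlflow.log_metric("test_auc", auc)
```

Each run becomes a comparable record; promoting the best run's model into a registry is the same pattern taken one step further.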
• Fixed Shifts: 12:00 PM - 9:30 PM IST (Summer) | 1:00 PM - 10:30 PM IST (Winter).
• No Weekend Work: Real work-life balance, not just words.
• Day 1 Benefits: Laptop and full medical insurance provided.
• Support That Matters: Mentorship, community, and forums where ideas are shared.
• True Belonging: A long-term career where your contributions are valued.
October 10
5001 - 10000
Data Architect responsible for designing and implementing database solutions at CSG, collaborating with architects to drive changes in business processes and technologies.
AWS
Postgres
October 7
1001 - 5000
Data Engineer leading data engineering and integration projects for Escalent, a data analytics firm, collaborating with clients and using modern data architectures to build unified consumer views.
Airflow
Azure
Cloud
ETL
Python
SQL
October 7
1001 - 5000
Data Engineer leading projects to create a Unified Consumer View from multiple data sources, working in a remote-first/hybrid data engineering and integration environment.
Airflow
Azure
Cloud
ETL
Python
SQL
October 7
Lead Data Engineer I at Ollion, designing and implementing data platforms, APIs, and pipelines while leveraging cloud services to ensure scalable solutions.
Airflow
AWS
Docker
IoT
JavaScript
Kubernetes
Microservices
Python
SDLC
SQL
October 3
Data Engineer enabling data-informed decision-making through scalable solutions and data infrastructure, collaborating with cross-functional teams in a high-social-impact industry.
Airflow
Amazon Redshift
AWS
Cloud
EC2
ETL
NoSQL
PySpark
Python
Spark
SQL