
B2B • SaaS • Recruitment
Smart Working is a recruitment service specializing in sourcing and providing top-tier software developers from around the world to meet the needs of businesses. With a robust vetting process that includes technical assessments and background checks, Smart Working ensures that clients receive highly skilled developers adept in various programming languages and frameworks. The company focuses on flexible and remote hiring solutions, allowing businesses to efficiently scale their development teams while benefiting from significant cost savings.
October 31

• Lead the migration of data pipelines and schemas from SQL to Snowflake, ensuring reliability, scalability, and optimal performance.
• Design and implement data warehouse architecture in Snowflake, including schema design, performance tuning, and SnowSQL scripting.
• Build and maintain ETL/ELT pipelines using orchestration tools such as DBT, Astronomer, or Airflow (see the illustrative sketch after this list).
• Integrate and optimise data flow within Azure services, ensuring smooth operation across cloud environments.
• Develop and maintain Python-based automation scripts for data transformation and workflow management.
• Collaborate closely with analytics, product, and engineering teams to ensure data quality, accessibility, and consistency.
• Support ongoing improvements in data governance, documentation, and monitoring standards.
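As a rough illustration of the pipeline work described above, the sketch below shows a minimal Airflow DAG that loads a staged extract into Snowflake and runs a simple in-warehouse transformation. All identifiers (DAG id, stage, tables, warehouse, schemas) are hypothetical, and credentials are read from environment variables rather than an Airflow connection purely to keep the example self-contained.

```python
"""Minimal ELT sketch: Airflow DAG that copies a staged file into Snowflake
and builds a summary table. Object names below are hypothetical examples."""
import os
from datetime import datetime

import snowflake.connector
from airflow import DAG
from airflow.operators.python import PythonOperator


def load_into_snowflake() -> None:
    # Credentials come from the environment here; in a real deployment they
    # would live in an Airflow connection or a secrets backend.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",   # hypothetical warehouse
        database="ANALYTICS",       # hypothetical database
        schema="RAW",               # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # Load files already landed in an external stage (e.g. Azure Data Lake)
        # into a raw table -- the "load" half of ELT.
        cur.execute(
            "COPY INTO RAW.ORDERS FROM @AZURE_STAGE/orders/ "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        # Simple in-warehouse transformation into a reporting table.
        cur.execute(
            "CREATE OR REPLACE TABLE MART.DAILY_ORDERS AS "
            "SELECT order_date, COUNT(*) AS orders FROM RAW.ORDERS GROUP BY order_date"
        )
    finally:
        conn.close()


with DAG(
    dag_id="orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_and_transform", python_callable=load_into_snowflake)
```

In practice the transformation step would typically live in a DBT model invoked by the orchestrator rather than inline SQL; the sketch only shows the shape of the workflow.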
• 5+ years of experience in data engineering, with strong expertise in data warehousing and ETL/ELT development.
• Proven hands-on experience with Snowflake, including schema design, SnowSQL, and performance optimisation.
• Practical experience with Azure data services (e.g., Data Factory, Data Lake, Synapse).
• Proficiency with data pipeline orchestration tools such as DBT, Astronomer, or Apache Airflow.
• Strong Python skills for automation and data manipulation.
• Advanced proficiency in SQL for data modelling and query optimisation.
• Strong understanding of scalable and secure data architecture design.
• Fixed Shifts: 12:00 PM - 9:30 PM IST (Summer) | 1:00 PM - 10:30 PM IST (Winter)
• No Weekend Work: Real work-life balance, not just words
• Day 1 Benefits: Laptop and full medical insurance provided
• Support That Matters: Mentorship, community, and forums where ideas are shared
• True Belonging: A long-term career where your contributions are valued
October 28
Azure DataOps Lead responsible for operational delivery and automation of Azure-based data platforms for enterprise clients. Driving collaboration among Data Engineering, Cloud, and Business teams.
Azure
Cloud
Python
SQL
Vault
October 17
Senior Data Engineer at Syneos Health leading advanced software development in R and Python. Overseeing clinical data strategies and architecting scalable CI/CD pipelines for innovation.
Docker
Python
SQL
October 15
Senior Data Engineer developing cloud-based data analytics solutions. Collaborating with Technology Consulting team to address client business challenges and drive insights.
Azure
BigQuery
Cloud
ETL
Google Cloud Platform
Microservices
NoSQL
Python
SDLC
SQL
October 13
Data Architect designing canonical JSONs for clinical workflows. Collaborating to enhance data consistency, traceability, and patient care in the healthcare domain.
October 12
GCP Data Engineer supporting key client in retail industry with hands-on experience in GCP and Azure. Responsible for designing data pipelines and managing DBT models.
Airflow
Azure
BigQuery
Cloud
ETL
Google Cloud Platform
Python
SQL