Senior Data Engineer – Snowflake, Azure


October 31


Smart Working

B2B • SaaS • Recruitment

Smart Working is a recruitment service specializing in sourcing and providing top-tier software developers from around the world to meet the needs of businesses. With a robust vetting process that includes technical assessments and background checks, Smart Working ensures that clients receive highly skilled developers adept in various programming languages and frameworks. The company focuses on flexible and remote hiring solutions, allowing businesses to efficiently scale their development teams while benefiting from significant cost savings.

51 - 200 employees

🤝 B2B

☁️ SaaS

🎯 Recruiter

📋 Description

• Lead the migration of data pipelines and schemas from SQL to Snowflake, ensuring reliability, scalability, and optimal performance.
• Design and implement data warehouse architecture in Snowflake, including schema design, performance tuning, and SnowSQL scripting.
• Build and maintain ETL/ELT pipelines using orchestration tools such as DBT, Astronomer, or Airflow (see the sketch after this list).
• Integrate and optimise data flow within Azure services, ensuring smooth operation across cloud environments.
• Develop and maintain Python-based automation scripts for data transformation and workflow management.
• Collaborate closely with analytics, product, and engineering teams to ensure data quality, accessibility, and consistency.
• Support ongoing improvements in data governance, documentation, and monitoring standards.
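As a rough illustration of the orchestration work described above (not part of the original posting), here is a minimal sketch of a daily ELT job using Apache Airflow's TaskFlow API together with the Snowflake Python connector. The account, credentials, stage, and table names are hypothetical placeholders.

```python
# Minimal ELT orchestration sketch: Airflow 2.x TaskFlow API + Snowflake connector.
# All connection details, stages, and tables below are illustrative placeholders.
from datetime import datetime

import snowflake.connector  # pip install snowflake-connector-python
from airflow.decorators import dag, task  # Apache Airflow 2.x


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["elt"])
def daily_orders_elt():
    """Load staged files into Snowflake, then merge them into a reporting table."""

    @task
    def load_and_merge() -> int:
        # Hypothetical account and credentials; in practice these would come from
        # an Airflow connection or a secrets backend, never hard-coded.
        conn = snowflake.connector.connect(
            account="my_account", user="etl_user", password="***",
            warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
        )
        try:
            cur = conn.cursor()
            # Bulk-load files from a named stage into the staging table.
            cur.execute(
                "COPY INTO staging.orders FROM @orders_stage "
                "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
            )
            # Incrementally merge new and changed rows into the reporting table.
            cur.execute("""
                MERGE INTO marts.orders AS tgt
                USING staging.orders AS src
                  ON tgt.order_id = src.order_id
                WHEN MATCHED THEN UPDATE SET tgt.status = src.status
                WHEN NOT MATCHED THEN INSERT (order_id, status)
                                      VALUES (src.order_id, src.status)
            """)
            return cur.rowcount or 0
        finally:
            conn.close()

    load_and_merge()


daily_orders_elt()
```

In a production setup of this kind, the transformation SQL would more likely live in DBT models and the credentials in an Airflow connection or secrets manager; the inline SQL here is only to keep the sketch self-contained.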

🎯 Requirements

• 5+ years of experience in data engineering, with strong expertise in data warehousing and ETL/ELT development.
• Proven hands-on experience with Snowflake, including schema design, SnowSQL, and performance optimisation.
• Practical experience with Azure data services (e.g., Data Factory, Data Lake, Synapse).
• Proficiency with data pipeline orchestration tools such as DBT, Astronomer, or Apache Airflow.
• Strong Python skills for automation and data manipulation (see the sketch after this list).
• Advanced proficiency in SQL for data modelling and query optimisation.
• Strong understanding of scalable and secure data architecture design.
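To make the Python and data-manipulation requirements above concrete, the following is a small, hypothetical sketch (not from the posting): cleaning a raw CSV extract with pandas and bulk-loading it into a Snowflake staging table. Paths, table names, and credentials are placeholders.

```python
import pandas as pd
import snowflake.connector
# write_pandas requires the pandas extra: pip install "snowflake-connector-python[pandas]"
from snowflake.connector.pandas_tools import write_pandas


def load_daily_extract(csv_path: str) -> int:
    """Clean a daily CSV extract and append it to a Snowflake staging table."""
    df = pd.read_csv(csv_path, parse_dates=["order_date"])

    # Light data manipulation before load: normalise column names and
    # drop exact duplicate rows.
    df.columns = [c.strip().upper() for c in df.columns]
    df = df.drop_duplicates()

    # Hypothetical connection details; real scripts would read these from
    # environment variables or a secrets manager.
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
    )
    try:
        # write_pandas stages the DataFrame and runs COPY INTO under the hood.
        success, _, nrows, _ = write_pandas(
            conn, df, table_name="ORDERS_RAW", auto_create_table=True
        )
        return nrows if success else 0
    finally:
        conn.close()


if __name__ == "__main__":
    print(load_daily_extract("extracts/orders_2024-01-01.csv"))
```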

🏖️ Benefits

• Fixed Shifts: 12:00 PM - 9:30 PM IST (Summer) | 1:00 PM - 10:30 PM IST (Winter)
• No Weekend Work: Real work-life balance, not just words
• Day 1 Benefits: Laptop and full medical insurance provided
• Support That Matters: Mentorship, community, and forums where ideas are shared
• True Belonging: A long-term career where your contributions are valued


Similar Jobs

October 28

Rackspace Technology

5,001 - 10,000 employees

☁️ SaaS

Azure DataOps Lead responsible for operational delivery and automation of Azure-based data platforms for enterprise clients. Driving collaboration among Data Engineering, Cloud, and Business teams.

🇮🇳 India – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

October 17

Syneos Health

10,000+ employees

🧬 Biotechnology

💊 Pharmaceuticals

⚕️ Healthcare Insurance

Senior Data Engineer at Syneos Health leading advanced software development in R and Python. Overseeing clinical data strategies and architecting scalable CI/CD pipelines for innovation.

🇮🇳 India – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

October 15

NowVertical Group Inc.

201 - 500 employees

🤖 Artificial Intelligence

💸 Finance

📱 Media

Senior Data Engineer developing cloud-based data analytics solutions. Collaborating with Technology Consulting team to address client business challenges and drive insights.

🇮🇳 India – Remote

💰 Post-IPO Debt on 2023-01

⏰ Full Time

🟠 Senior

🚰 Data Engineer

October 13

Zenara Health

1 - 10 employees

☁️ SaaS

🤖 Artificial Intelligence

🤝 B2B

Data Architect designing canonical JSONs for clinical workflows. Collaborating to enhance data consistency, traceability, and patient care in the healthcare domain.

🇮🇳 India – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

October 12

Ciphercru Innovation Pvt Ltd

51 - 200 employees

🤝 B2B

☁️ SaaS

🤖 Artificial Intelligence

GCP Data Engineer supporting key client in retail industry with hands-on experience in GCP and Azure. Responsible for designing data pipelines and managing DBT models.

🇮🇳 India – Remote

💵 ₹10k / month

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer
