Data Engineer II

November 5


Duck Creek Technologies

Insurance • SaaS • Enterprise

Duck Creek Technologies is a leading provider of comprehensive software solutions for the insurance industry, specifically property and casualty (P&C) insurance. Its platform offers a suite of core system solutions, including policy, billing, claims, and analytics. With a strong emphasis on low-code configurability, Duck Creek enables insurers to quickly implement changes and launch new products to meet market needs. The company also emphasizes partnerships with leading technology innovators to deliver enhanced customer experiences and streamline operations for insurers around the world.

1001 - 5000 employees

Founded 2000

☁️ SaaS

🏢 Enterprise

💰 $230M Private Equity Round on 2020-06

📋 Description

• Design, code, and/or configure solutions for moderate-complexity Agile stories
• Debug and resolve moderate-complexity software bugs or issues
• Create conceptual designs/architectures for small-scale software solutions
• Provide guidance and mentoring to more junior software engineers
• Follow development standards and effectively demonstrate technical solutions to other software engineers in code reviews
• Design, implement, and manage scalable data pipelines and ETL/ELT processes using Snowflake and related technologies (see the sketch after this list)
• Develop and optimize complex SQL queries and data models to support business intelligence and analytics requirements
• Work closely with data architects, analysts, and other stakeholders to understand business requirements and deliver effective data solutions
• Ensure data quality, integrity, and security across all data platforms and pipelines
• Perform data migration, integration, and transformation tasks from various sources into Snowflake
• Troubleshoot and resolve issues related to data loads, performance, and data availability
• Document data processes, technical specifications, and best practices
• Stay updated on cloud-native SaaS architecture and industry trends, recommending improvements and new technologies as appropriate
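
The pipeline and ELT/ELT responsibilities above are described at a high level. As an illustration only, the sketch below shows what one such load step might look like in Python, assuming the snowflake-connector-python package and hypothetical stage, table, and credential names (none of these come from the listing itself):

```python
import os

import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical ELT step: land staged files into a raw table with COPY INTO,
# then upsert into a curated table with MERGE. All object names are placeholders.
COPY_RAW = """
    COPY INTO raw.policies
    FROM @raw.policy_stage
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    ON_ERROR = 'ABORT_STATEMENT'
"""

MERGE_CURATED = """
    MERGE INTO curated.policies AS tgt
    USING raw.policies AS src
      ON tgt.policy_id = src.policy_id
    WHEN MATCHED THEN UPDATE SET
      tgt.status = src.status,
      tgt.premium = src.premium,
      tgt.updated_at = src.updated_at
    WHEN NOT MATCHED THEN INSERT (policy_id, status, premium, updated_at)
      VALUES (src.policy_id, src.status, src.premium, src.updated_at)
"""


def run_load() -> None:
    # Credentials come from the environment; warehouse/database names are
    # placeholders, not values taken from the job posting.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",
        database="INSURANCE",
    )
    try:
        cur = conn.cursor()
        cur.execute(COPY_RAW)       # load staged files into the raw layer
        cur.execute(MERGE_CURATED)  # upsert raw rows into the curated layer
    finally:
        conn.close()


if __name__ == "__main__":
    run_load()
```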

🎯 Requirements

• Bachelor’s degree or higher, or its foreign equivalent, in Computer Science, Computer Information Sciences, or a related field
• 10+ years of total work experience in software development
• 6 years minimum experience
• 4 years minimum product experience, 6+ years preferred
• 4 years minimum domain experience, 6+ years preferred
• Experience with cloud networking architectures and an understanding of DevOps
• Experience with industry best practices (e.g., Azure infrastructure, private connectivity, Azure DevOps Services, CI/CD pipelines, Terraform on Azure)
• Experience with other cloud data technologies (e.g., Snowflake, SQL, Azure Data Factory, Azure Storage) is a plus
• Knowledge of data governance, security, and compliance best practices
• Familiarity with DevOps practices and CI/CD pipelines for data engineering (see the sketch after this list)
• Experience with version control systems (e.g., Git)
• Knowledge of Duck Creek Policy, Billing, Claims, Party, or Clarity is a plus
• Excels at managing deadlines and communicating within a team
• Expert in estimation, analysis, and the software product development lifecycle using Agile methodology
• Strong insurance domain knowledge
• Legally authorized to work in the country of the job location
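
The CI/CD and data quality expectations above name no specific tooling. A minimal sketch, assuming pytest, snowflake-connector-python, and hypothetical table and column names, of checks that could run as a pipeline step:

```python
import os

import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical CI data-quality checks; the table name, column, and thresholds
# are placeholders, not requirements from the posting.
TABLE = "curated.policies"


def _scalar(sql: str):
    # Open a short-lived connection and return the first value of the first row.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        return conn.cursor().execute(sql).fetchone()[0]
    finally:
        conn.close()


def test_table_is_not_empty():
    assert _scalar(f"SELECT COUNT(*) FROM {TABLE}") > 0


def test_policy_id_has_no_nulls():
    nulls = _scalar(f"SELECT COUNT(*) FROM {TABLE} WHERE policy_id IS NULL")
    assert nulls == 0
```

Run with `pytest` in a CI job after each load; both tests fail fast if the curated table is empty or the key column contains nulls.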

🏖️ Benefits

• Health insurance
• 401(k) matching
• Flexible work hours
• Paid time off
• Remote work options

Apply Now

Similar Jobs

November 4

NextHire

11 - 50 employees

Data Engineer / Integration Engineer at Algotale designing and maintaining scalable data pipelines. Collaborating with cross-functional teams to enhance data workflows across platforms.

🇮🇳 India – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

November 3

Avery Dennison

10,000+ employees

🤝 B2B

Data Engineer developing high-performance data pipelines using Oracle solutions at Avery Dennison. Collaborating with data teams to enhance operational efficiency and value for customers.

🇮🇳 India – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

October 31

NextHire

11 - 50 employees

Azure Data Engineer developing and enhancing IT products in data analytics using Azure technologies. Collaborating with teams to ensure the delivery of efficient data solutions.

🇮🇳 India – Remote

⏰ Full Time

🟡 Mid-level

🟠 Senior

🚰 Data Engineer

October 31

NextHire

11 - 50 employees

Principal Data Engineer responsible for developing big data solutions using Databricks and Azure. Collaborating with CX Data Labs to enhance data engineering processes.

🇮🇳 India – Remote

⏰ Full Time

🔴 Lead

🚰 Data Engineer

October 31

NextHire

11 - 50 employees

Data Engineer/Sr. Data Engineer at Estuate Software, focused on ETL development and data pipeline management using GCP and Python. Joining a global tech company specializing in data-driven solutions.

🇮🇳 India – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer
