Snowflake Data Engineer

March 17

Apply Now

iSoftTek Solutions Inc

Connecting IT Experts with top companies across the nation

201 - 500 employees

Description

• Are you a Data Engineer at a large financial institution whose leadership tells you that you are too hands-on, too detail-oriented, or that you think and work like a start-up? We look forward to you joining our Platform Engineering Team.
• Our Platform Engineering Team is working to solve the Multiplicity Problem. We are trusted by some of the most reputable and established FinTech firms. Recently, our team spearheaded the conversion and go-live of applications that support the backbone of the financial trading industry.
• Design, develop, and maintain data pipelines to ingest, transform, and load data from various sources into Snowflake.
• Implement ETL (Extract, Transform, Load) processes using Snowflake features such as Snowpipe, Streams, and Tasks.
• Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and business intelligence needs.
• Optimize data warehouse performance and scalability using Snowflake features such as clustering, partitioning, and materialized views.
• Integrate Snowflake with external systems and data sources, including on-premises databases, cloud storage, and third-party APIs.
• Implement data synchronization processes to ensure consistency and accuracy of data across different systems.
• Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management features.
• Identify and resolve performance bottlenecks and optimize data warehouse configurations for maximum efficiency.
• Work on Snowflake modeling (roles, databases, schemas) and ETL tools, with cloud-driven skills.
• Measure SQL performance and perform query tuning and database tuning.
• Work with SQL and cloud-based technologies.
• Set up the RBAC model at the infrastructure and data levels.
• Work on data masking, encryption, and tokenization; data wrangling; ECreLT; and data pipeline orchestration (Tasks).
• Set up AWS S3/EC2, and configure external stages and SQS/SNS.
• Perform data integration, e.g. with MSK / Kafka Connect and partners such as Delta Lake (Databricks).
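To make the ingestion responsibilities above concrete, here is a minimal sketch of the kind of Snowpipe → Stream → Task pipeline the role describes. All object names (the stage, pipe, tables, warehouse, and storage integration) are hypothetical placeholders, and the storage integration and S3 event notifications are assumed to be configured separately:

```sql
-- External stage over an S3 bucket (storage integration s3_int assumed to exist).
CREATE STAGE raw.orders_stage
  URL = 's3://example-bucket/orders/'
  STORAGE_INTEGRATION = s3_int
  FILE_FORMAT = (TYPE = JSON);

-- Snowpipe: auto-ingest new files as S3 event notifications arrive via SQS.
CREATE PIPE raw.orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.orders_landing
  FROM @raw.orders_stage;

-- Stream: tracks newly loaded rows in the landing table for incremental processing.
CREATE STREAM raw.orders_stream ON TABLE raw.orders_landing;

-- Task: periodically merges new rows into the curated model,
-- running only when the stream actually has data.
CREATE TASK raw.orders_merge_task
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
AS
  MERGE INTO analytics.orders t
  USING raw.orders_stream s
    ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.amount = s.amount
  WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK raw.orders_merge_task RESUME;
```

This is the pattern the posting's bullets outline: files land in S3, SQS notifications trigger Snowpipe to copy them into a landing table, a stream captures the delta, and a scheduled task merges it downstream.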

Requirements

• Data Wrangling
• ETL
• Talend
• Jasper
• Java
• Python
• Unix
• AWS
• Data Warehousing
• Data Modeling
• Database Migration
• ECreLT
• RBAC model
• Data Migration
