Senior Data Engineer – ML Engineer

Job not on LinkedIn

November 5

Apply Now

Allwyn Corporation

Artificial Intelligence • Enterprise • B2B

Allwyn Corporation is a technology services company that helps organizations modernize and transform through data-driven solutions. The company provides AI and machine learning, data analytics and business intelligence, enterprise application development, IT modernization, agile software delivery, low-code development, and offshore delivery services, with expertise in building unified data platforms and enabling operational efficiency. Allwyn primarily serves enterprise and government clients to accelerate digital transformation and improve business processes.

📋 Description

• Demonstrate expert ability in implementing data warehouse solutions using Snowflake.
• Build data integration solutions between transaction systems and the analytics platform.
• Expand data integration solutions to ingest data from internal and external sources and transform it to meet business consumption needs.
• Create security policies in Snowflake to manage fine-grained access control.
• Develop tasks for a multitude of data patterns, e.g., real-time data integration, advanced analytics, machine learning, BI, and reporting.
• Lead POC efforts to build foundational AI/ML services for predictive analytics.
• Build data products through data enrichment and ML.
• Be a team player and share knowledge with existing team members.
• Be self-directed, with excellent organizational, analytical, and interpersonal skills, and consistently meet or exceed deadline deliverables.
• Use strong communication skills to facilitate meetings and workshops that collect data, functional, and technology requirements; document processes, data flows, and gap analyses; and gather associated data to support data management/governance efforts.
• Apply knowledge of data standards and principles to drive best practices around data management activities and solutions.
• Understand the importance and benefits of good data quality, and champion results across functions.
• Bring experience with one or more data integration tools, e.g., Attunity (Qlik), AWS Glue ETL, Talend, Kafka.
• Demonstrate a strong understanding of data security: authorization, authentication, encryption, and network security.
• Lead collaborative meetings that result in clearly documented outcomes, a concrete understanding of attendee performance and reliability, and ongoing management and follow-up of action items.
• Act with integrity and proactively seek ways to ensure compliance with regulations, policies, and procedures.
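To give candidates a flavor of the ingest-and-enrich work described above, here is a minimal Python sketch. All record shapes, field names, and the tier lookup are hypothetical illustrations, not Allwyn's actual data model:

```python
# Minimal sketch of an ingest-and-enrich step, of the kind that might feed
# an analytics platform such as Snowflake. All names are hypothetical.
import json


def enrich_orders(raw_records, customer_tiers):
    """Parse raw JSON order events and enrich each with a customer tier."""
    enriched = []
    for line in raw_records:
        order = json.loads(line)
        # Data-quality gate: skip records missing required keys.
        if "order_id" not in order or "customer_id" not in order:
            continue
        order["tier"] = customer_tiers.get(order["customer_id"], "unknown")
        enriched.append(order)
    return enriched


events = ['{"order_id": 1, "customer_id": "c1", "amount": 42.0}',
          '{"order_id": 2, "amount": 10.0}']   # second record is malformed
tiers = {"c1": "gold"}
rows = enrich_orders(events, tiers)
```

In a production pipeline the same validate-then-enrich pattern would typically run inside an orchestrated task (Glue, Snowflake tasks, or similar) rather than a standalone script.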

🎯 Requirements

• Bachelor’s degree in computer science or a related field.
• Minimum of 6-8 years of experience building data-driven solutions.
• Expertise in real-time data solutions; knowledge of stream processing, message-oriented platforms, and ETL/ELT tools is good to have.
• Strong scripting experience with Python and SQL.
• Working knowledge of foundational AWS compute, storage, networking, and IAM.
• Solid scripting experience with AWS Lambda functions; knowledge of CloudFormation templates is good to have. Overall experience with AWS services should exceed three years.
• Hands-on experience with popular cloud-based data warehouse platforms, e.g., Redshift and Snowflake.
• Experience building data pipelines, with an understanding of ingesting and transforming structured, semi-structured, and unstructured data across cloud services.
• Hands-on experience using and extending machine learning frameworks and libraries, e.g., scikit-learn, PyTorch, TensorFlow, XGBoost.
• Experience with the AWS SageMaker family of services or similar tools for developing machine learning models.
• Understanding of generative AI models, prompt engineering, RAG, fine-tuning, and pre-training is a plus.
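As a taste of the AWS Lambda scripting the requirements call for, here is a minimal handler sketch that validates an incoming event before passing it downstream. The event shape and field names are hypothetical, and the function is written so it can be exercised locally:

```python
# Hypothetical AWS Lambda handler: validate an incoming payload and return
# a response in the shape API Gateway expects. Runnable locally for testing.
import json


def handler(event, context=None):
    """Reject payloads missing a 'record' key; echo a normalized record."""
    body = json.loads(event.get("body", "{}"))
    if "record" not in body:
        return {"statusCode": 400,
                "body": json.dumps({"error": "missing record"})}
    # Normalize field names to lowercase before downstream processing.
    record = {k.lower(): v for k, v in body["record"].items()}
    return {"statusCode": 200, "body": json.dumps({"record": record})}


resp = handler({"body": json.dumps({"record": {"ID": 7, "Name": "x"}})})
```

Keeping the handler a pure function of its event makes it straightforward to unit-test without deploying to AWS.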


Similar Jobs

November 5

Data Architect shaping enterprise data ecosystems for Abercrombie & Fitch. Collaborating on data architecture to optimize systems and processes for data-driven decision making.

AWS • Azure • Cloud • Google Cloud Platform • Kafka • Spark

November 4

Using data engineering skills to support complex decision-making processes. Collaborating with teams to develop data-driven solutions at Two Six Technologies.

AWS • Cloud • ElasticSearch • Postgres • Python • SQL

November 4

Data Engineering Lead/Architect designing and maintaining scalable data pipelines. Collaborating with teams for analytics and machine learning projects with a focus on large data systems.

Airflow • Apache • AWS • Azure • Cloud • ETL • Google Cloud Platform • Kafka • Pandas • PySpark • Python • Spark

November 4

Remote Data Engineer focusing on GCP and data pipelines to support analytics and data science. Collaborating with product managers to develop solutions in a cloud environment.

Airflow • Apache • Cloud • Google Cloud Platform • Python • SQL

November 4

Salesforce Data Architect focused on designing and managing data architecture in Salesforce for a leading system integrator. Collaborating with stakeholders and ensuring system performance and data integrity.
