Gen AI Data Engineer


April 15


Tiger Analytics

Artificial Intelligence • B2B • Consulting

Tiger Analytics is a leading AI and analytics consulting firm that specializes in leveraging data science and machine learning to provide strategic business insights across various industries. They offer services in data strategy, AI engineering, and business intelligence to enable data-driven decision-making and digital transformation for their clients. Tiger Analytics collaborates with top technology partners like Microsoft, Google Cloud, and AWS to deliver cutting-edge solutions. They serve a diverse range of sectors including consumer packaged goods, healthcare, and finance, helping businesses operationalize insights and differentiate themselves through AI and machine learning.

1001 - 5000 employees

Founded 2011

🤖 Artificial Intelligence

🤝 B2B

📋 Description

• Tiger Analytics is looking for experienced Machine Learning Engineers with Gen AI experience to join our fast-growing advanced analytics consulting firm.

• Our employees bring deep expertise in Machine Learning, Data Science, and AI.

• We are the trusted analytics partner for multiple Fortune 500 companies, enabling them to generate business value from data.

• Our business value and leadership have been recognized by various market research firms, including Forrester and Gartner.

• We are looking for top-notch talent as we continue to build the best global analytics consulting team in the world.

• You will be responsible for:

Technical Skills Required:

• Programming Languages: Proficiency in Python, SQL, and PySpark.

• Data Warehousing: Experience with Snowflake, NoSQL databases, and Neo4j.

• Data Pipelines: Proficiency with Apache Airflow.

• Cloud Platforms: Familiarity with AWS or GCP.

• Operating Systems: Experience with Linux.

• Batch/Real-time Pipelines: Experience in building and deploying batch and real-time pipelines.

• Version Control: Experience with GitHub.

• Development Tools: Proficiency with VS Code.

• Engineering Practices: Skills in testing, deployment automation, and DevOps/SysOps.

• Communication: Strong presentation and communication skills.

• Collaboration: Experience working with onshore/offshore teams.
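The pipeline skills listed above (Airflow, batch/real-time pipelines) share one core idea: tasks with declared upstream dependencies, executed in dependency order. Below is a minimal stdlib-only sketch of that idea, assuming an invented three-step extract/transform/load pipeline with toy data; a real deployment would define these as Airflow operators inside a DAG rather than plain functions.

```python
from graphlib import TopologicalSorter

# Toy stand-in for an Airflow DAG: task names and data are illustrative,
# not from the posting. Each task reads/writes a shared results dict.
results = {}

def extract():
    results["extract"] = [1, 2, 3]                      # pretend source rows

def transform():
    results["transform"] = [x * 10 for x in results["extract"]]

def load():
    results["load"] = sum(results["transform"])          # pretend sink write

tasks = {"extract": extract, "transform": transform, "load": load}
deps = {"transform": {"extract"}, "load": {"transform"}}  # task -> upstream tasks

def run_pipeline():
    # Execute tasks in dependency order, as a scheduler like Airflow would.
    for name in TopologicalSorter(deps).static_order():
        tasks[name]()
    return results["load"]
```

In Airflow the same structure is expressed with operators and the `>>` dependency operator; the scheduler then handles ordering, retries, and backfills that this sketch omits.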

🎯 Requirements

Desired Skills:

• Big Data Technologies: Experience with Hadoop and Spark.

• Data Visualization: Proficiency with Streamlit and dashboards.

• APIs: Experience in building and maintaining internal APIs.

• Machine Learning: Basic understanding of ML concepts.

• Generative AI: Familiarity with generative AI tools and techniques.

Additional Expertise:

• Knowledge Graphs: Experience with creation and retrieval.

• Vector Databases: Proficiency in managing vector databases.

• Data Persistence: Ability to develop and maintain multiple forms of data persistence and retrieval (RDBMS, vector databases, buckets, graph databases, knowledge graphs, etc.).

• Cloud Technologies: Experience with AWS, especially SageMaker, Lambda, and OpenSearch.

• Automation Tools: Experience with Airflow DAGs, AutoSys, and CronJobs.

• Unstructured Data Management: Experience managing unstructured data (audio, video, image, text, etc.).

• CI/CD: Expertise in continuous integration and deployment using Jenkins and GitHub Actions.

• Infrastructure as Code: Advanced skills in Terraform and CloudFormation.

• Containerization: Knowledge of Docker and Kubernetes.

• Monitoring and Optimization: Proven ability to monitor system performance, reliability, and security, and optimize them as needed.

• Security Best Practices: In-depth understanding of security best practices in cloud environments.

• Scalability: Experience in designing and managing scalable infrastructure.

• Disaster Recovery: Knowledge of disaster recovery and business continuity planning.

• Problem-Solving: Excellent analytical and problem-solving abilities.

• Adaptability: Ability to stay up to date with the latest industry trends and adapt to new technologies and methodologies.

• Team Collaboration: Proven ability to work well in a team environment and contribute to a positive, collaborative culture.
GenAI Engineer Specific Skills:

• Industry Experience: 8+ years of experience in data engineering, platform engineering, or related fields, with deep expertise in designing and building distributed data systems and large-scale data warehouses.

• Data Platforms: Proven track record of architecting data platforms capable of processing petabytes of data and supporting real-time and batch ingestion processes.

• Data Pipelines: Strong experience in building robust data pipelines for document ingestion, indexing, and retrieval to support scalable RAG solutions, with proficiency in information retrieval systems and vector search technologies (e.g., FAISS, Pinecone, Elasticsearch, Milvus).

• Graph Algorithms: Experience with graphs/graph algorithms, LLMs, optimization algorithms, relational databases, and diverse data formats.

• Data Infrastructure: Proficiency in infrastructure and architecture for optimal extraction, transformation, and loading of data from various sources.

• Data Curation: Hands-on experience curating and collecting data from a variety of traditional and non-traditional sources.

• Ontologies: Experience building ontologies in the knowledge retrieval space, working with schema-level constructs (including higher-level classes, punning, and property inheritance), and using openCypher.

• Integration: Experience integrating external databases, APIs, and knowledge graphs into RAG systems to improve contextualization and response generation.

• Experimentation: Ability to conduct experiments evaluating the effectiveness of RAG workflows, analyze the results, and iterate to achieve optimal performance.
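The RAG retrieval requirement above reduces to one operation: rank stored document vectors by similarity to a query vector and return the top matches. Here is a toy, stdlib-only sketch of that step, assuming hand-made three-dimensional vectors and invented document titles; production systems would use FAISS, Pinecone, or Milvus over model-generated embeddings.

```python
import math

# In-memory "vector index": title -> embedding. Titles and vectors are
# invented for illustration; real embeddings come from an embedding model.
DOCS = {
    "invoice policy": [0.9, 0.1, 0.0],
    "onboarding guide": [0.1, 0.8, 0.3],
    "api reference": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    # Cosine similarity: dot product normalized by vector magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, k=2):
    # Rank documents by similarity to the query and return the top-k titles;
    # these would then be passed to the LLM as context in a RAG pipeline.
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:k]
```

Dedicated vector databases replace the exhaustive `sorted` scan with approximate nearest-neighbor indexes (e.g., HNSW or IVF), which is what makes retrieval tractable at the petabyte scales the posting mentions.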

🏖️ Benefits

• This position offers an excellent opportunity for significant career development in a fast-growing and challenging entrepreneurial environment with a high degree of individual responsibility.

