Senior Data Engineer


October 28


Knowledge Anywhere

SaaS • Education • Enterprise

Knowledge Anywhere provides a feature-rich Learning Management System (LMS) designed to streamline and enhance corporate training programs. The company offers a centralized platform that lets businesses create, manage, and track training courses for employees, customers, or partners. The LMS includes advanced features such as AI-powered assessments, badges and leaderboards, course conversion tools, and reporting and analytics. Knowledge Anywhere serves industries including financial services, healthcare, and manufacturing, and also offers custom course development, a large course library, and virtual reality training solutions to meet diverse training needs. With a focus on compliance, employee onboarding, and talent development, Knowledge Anywhere aims to improve training efficiency and drive organizational growth.

11 - 50 employees

Founded 1998


📋 Description

• As a Senior Data Engineer, you'll be responsible for building high-performance, scalable data solutions that meet the needs of millions of agents, brokers, home buyers, and sellers.

• You'll design, develop, and test robust, scalable data platform components.

• You'll work with a variety of teams and individuals, including product engineers, to understand their data pipeline needs and devise innovative solutions.

• You'll work with a team of talented engineers and collaborate with product managers and designers to help define new data products and features.

🎯 Requirements

• BS/MS in Computer Science, Engineering, or a related technical discipline, or an equivalent combination of training and experience.

• 5+ years of core Scala/Java experience building business logic layers and high-volume, low-latency, big data pipelines.

• 3+ years of experience in large-scale, real-time stream processing using Apache Flink or Apache Spark, with messaging infrastructure such as Kafka or Pulsar.

• 5+ years of experience in data pipeline development, ETL, and processing of structured and unstructured data.

• 3+ years of experience with NoSQL systems such as MongoDB and DynamoDB, relational SQL databases (PostgreSQL), and Athena.

• Experience with technologies such as Lambda, API Gateway, AWS Fargate, ECS, CloudWatch, S3, and DataDog.

• Experience owning and implementing technical/data solutions or pipelines.

• Excellent written and verbal communication skills in English.

• Strong work ethic and entrepreneurial spirit.

Apply Now

Similar Jobs

October 28

Data Engineer designing and maintaining scalable data pipelines for a leading Credit Platform Data team, enabling data-driven decision-making across the organization.

ETL • Linux • MySQL • Oracle • Pandas • PySpark • Python • RDBMS • SQL • Unix

October 28

Clinical Data Architect managing data architecture and clinical data integration for healthcare solutions. Collaborating with teams to ensure integrity and quality of ophthalmology data.

AWS • Cloud • ETL • Postgres • Python • SQL

October 28

Senior Data Engineer driving impactful data solutions at Mitek. Building scalable data systems for advanced analytics and business intelligence initiatives.

Airflow • AWS • Distributed Systems • MapReduce • NoSQL • Postgres • Python • SQL • Tableau • Terraform

October 25

Data Engineer architecting data solutions for enhancing patient care in healthcare. Collaborating with cross-functional teams to turn data into actionable insights.

Airflow • Amazon Redshift • Apache • AWS • Azure • BigQuery • Cloud • ETL • Python • Scala • SQL

October 25

Senior Data Engineer at Curinos, building B2B SaaS applications for financial institutions. Collaborating with engineers, AI scientists, and managers to deliver high-quality data-driven solutions.

Airflow • ETL • Python • SDLC • Spark • SQL • Unity
