
Healthcare Insurance • SaaS • Wellness
Inspiren is a comprehensive solution for managing and optimizing operations in senior living communities. The platform enhances resident care by providing decision-grade data analysis for care planning, staffing optimization, and resident safety. Inspiren uses advanced technology, such as AUGi, to enable proactive care, improve workflow efficiency, and align care plans with resident needs. The technology's capabilities include real-time alerts for incidents and early risk flagging, tracking resident interactions, and monitoring staff effectiveness, thus ensuring high-quality resident care and operational efficiency.
12 hours ago

• Collaborate with engineering, data science, ML, and product analytics teams to develop data models and pipelines for customer-facing applications, research, reporting, and machine learning.
• Develop, implement, and optimize ETL processes for ingesting, processing, and transforming large volumes of structured and unstructured data into our data ecosystem (see the sketch after this list).
• Optimize data models to support efficient data storage and retrieval processes for performance and scalability.
• Evaluate and implement a variety of data storage solutions, including RDS, NoSQL, data lakes, and cloud storage services.
• Work in close partnership with Platform Engineering to influence the direction and needs of the data platform.
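As a rough illustration of the ETL work described above, the sketch below shows a minimal PySpark batch job that ingests raw JSON events, deduplicates and cleans them, and writes a partitioned Parquet table. It is a hedged example only: the bucket paths, column names, and schema are hypothetical, and the listing does not prescribe this exact stack (AWS and Databricks are only noted as a plus).

```python
# Minimal PySpark ETL sketch: ingest raw JSON events, normalize them,
# and persist a partitioned Parquet table. Paths, columns, and schema
# below are hypothetical placeholders, not details from the listing.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import (
    StructType, StructField, StringType, DoubleType, TimestampType
)

spark = SparkSession.builder.appName("events-etl").getOrCreate()

# An explicit schema keeps ingestion of semi-structured input predictable.
schema = StructType([
    StructField("event_id", StringType(), nullable=False),
    StructField("resident_id", StringType(), nullable=True),
    StructField("event_type", StringType(), nullable=True),
    StructField("value", DoubleType(), nullable=True),
    StructField("occurred_at", TimestampType(), nullable=True),
])

# Hypothetical source location for raw JSON events.
raw = spark.read.schema(schema).json("s3://example-bucket/raw/events/")

cleaned = (
    raw
    .dropDuplicates(["event_id"])                         # idempotent re-runs
    .filter(F.col("occurred_at").isNotNull())             # drop unparseable records
    .withColumn("event_date", F.to_date("occurred_at"))   # partition key
)

# Hypothetical sink: partitioned Parquet for efficient retrieval.
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/curated/events/")
)
```

In practice the same pattern extends to incremental loads and other sinks (for example a warehouse table instead of Parquet); the point here is only the ingest-transform-write shape of the responsibility.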
• Bachelor's or Master's degree in Computer Science or a related engineering field.
• 8+ years of full stack or backend development experience.
• Fluency in building and maintaining ETL processes.
• Outstanding analytical skills and the ability to address problems in real-world settings.
• A demonstrated ability to work in a team, with excellent skills-sharing capabilities.
• Expertise in modern ETL technologies and building/supporting data pipelines at scale.
• Proven experience in evaluating and optimizing data architectures to improve performance and data discovery and reduce cost.
• Proven experience with cloud-based data engineering pipeline design at scale; AWS and Databricks experience is a plus.
• Proven proficiency in one or more programming languages such as Python or Java, as well as SQL.
• Well-versed in the development lifecycle and software engineering best practices.
• Health insurance
• Dental insurance
• Vision insurance
• Flexible PTO
Apply Now
Yesterday
Data Architect crafting enterprise data architecture for Dynatron, boosting analytics and ML. Involved in real-time data processing and governance in the automotive service industry.
Amazon Redshift
AWS
Azure
BigQuery
Cloud
Distributed Systems
Google Cloud Platform
Kafka
Pulsar
Python
Scala
SQL
Vault
3 days ago
Data Engineer building technology strategy for Conduent by designing and implementing data pipelines and transformations. Collaborating with business analysts and clients to meet complex data integration needs.
🇺🇸 United States – Remote
💵 $96.3k - $125k / year
💰 Venture Round on 2009-01
⏰ Full Time
🟠 Senior
🔴 Lead
🚰 Data Engineer
🦅 H1B Visa Sponsor
Azure
ETL
Numpy
Oracle
PySpark
Python
SQL
4 days ago
Chief Data Architect leading development of data strategy and architecture for OSC Edge. Providing technical expertise and overseeing complex data architecture projects in a remote capacity.
TypeScript
5 days ago
Manager, Data Engineering leading Keeper Security's Data Engineering team to architect and maintain data infrastructures and pipelines. Collaborating closely with data scientists and business stakeholders.
Amazon Redshift
AWS
Cloud
NoSQL
Python
SQL
Tableau
December 3
Data Architect designing a data foundation to support AI/ML applications at Leidos. Responsible for creating big data systems and ensuring data access and documentation.
🇺🇸 United States – Remote
💵 $104.7k - $189.2k / year
⏰ Full Time
🟠 Senior
🔴 Lead
🚰 Data Engineer
🦅 H1B Visa Sponsor
ETL
TypeScript