Chapter 5: Opportunities and Gaps in AI and the Hospital

The Evolution of AI in Patient Monitoring Systems

The integration of AI with patient sensing technology is becoming crucial in healthcare. Eric Yablonka highlights that while hospitals have traditionally focused on sensors for monitoring patients, the true innovation lies in using AI to analyze the data these sensors collect. This shift allows for actionable insights that enhance decision-making, addressing the industry’s historical lag in leveraging technology compared to other sectors. Narinder Singh notes that the real differentiation comes not just from data collection but from the ability to process information effectively—an area where AI can significantly improve patient outcomes. As the industry evolves, AI-driven solutions are expected to become standard in healthcare. 

Listen to Narinder Singh and Eric Yablonka discuss the opportunities and gaps in AI and the hospital. 


Video Transcript

Eric Yablonka: 

We’ve been talking about sensing for a very long time, and about putting sensors on patients, both on the inpatient and on the ambulatory side, for a very, very long time. We’ve talked with business partners about it and we’ve talked in industry meetings about it. And we know that in other industries, everything is sensored. Airplanes and airlines put sensors on all their equipment; otherwise they would not be able to handle the requirements to keep it up to date, and they couldn’t intervene when there’s a signal that something is going wrong. So sensing is really critical, and AI is the piece that was missing before to be able to take those streams of sensor data and use them. That’s one reason we brought in an ESB 10, 15 years ago: we wanted to stream large amounts of data and be able to use it. We didn’t have AI back then; today we do, and we can take that data and do really miraculous things with it. It’s a little early for some organizations, but one must think that in the next couple of years this is going to be standard and embedded in almost every product that we use.

Narinder Singh: 

I like what you said there. I think an interesting piece is that because we’ve only had the sensors in the past, we tend to focus a little bit too much on those. Clearly it’s critical that I be able to see and hear as a person in order to navigate complex situations. But it’s not the seeing or the hearing that’s differentiated across us, or across us versus animals; it’s the ability of our brains to process that information to make a decision. And so this notion that the AI is the key piece of being smart there, and that moving away from the sensors is the differentiated piece, is still an early trend, and it really requires us to let go of what we learned in the first 30 years of connecting and wiring things up. And I want to show an example of that for a second.