
Forecasting the next steps in AI and healthcare

Posted: 16 January 2017 | By Darcie Thompson-Fields

We are currently seeing products that loosely use AI around hospitals, but we are on the verge of new developments. Whilst many areas of healthcare can be assisted by AI, the key issues it can solve, or at least ease, are training, surgical robotic assistance and the use of data to anticipate patient-care needs.

Training

Currently, online virtual worlds are being used in hospitals to teach communication skills to surgeons. As a surgeon you are the captain: you have to lead a team throughout a surgery, yet many surgeons have never had the opportunity to experience being a leader and communicating effectively. With this software, a surgeon can log on and practise leading a team in an operating room. Teaching a surgeon to respond appropriately, rather than emotionally, to a situation within a team is an essential skill.

The avatar training models use AI logic similar to video games, but calling that full AI would be a stretch. The game follows a script that knows whether you are going down the right or wrong path and responds accordingly. If we could infuse these systems with more formal AI and make them smarter, they could be even more effective for training across the whole hospital. Because of the schedules of doctors and nurses, it can be very difficult to bring groups together for training; allowing that training to take place virtually could solve the issue.
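To make the distinction concrete, here is a minimal sketch of the kind of hand-scripted branching logic such training avatars rely on today. The scenario, options and scoring are entirely invented for illustration; nothing here reflects any particular training product.

```python
# Illustrative only: a hand-written branching script, the "video-game" logic
# described above, rather than anything learned or adaptive.
SCENARIO = {
    "start": {
        "prompt": "The anaesthetist reports a sudden drop in blood pressure.",
        "choices": {
            "Pause, ask for full vitals, and brief the team calmly": ("good", "end"),
            "Raise your voice and demand an immediate explanation": ("poor", "end"),
        },
    },
    "end": {"prompt": "Scenario complete.", "choices": {}},
}

def run_scenario(pick):
    """Walk the script, scoring one point for each well-rated response."""
    state, score = "start", 0
    while SCENARIO[state]["choices"]:
        node = SCENARIO[state]
        choice = pick(node["prompt"], list(node["choices"]))
        rating, state = node["choices"][choice]
        score += rating == "good"
    return score

# Example run: a trainee who always takes the first listed option.
print(run_scenario(lambda prompt, options: options[0]))  # -> 1
```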

Robots with vision

Major advances in robotic surgery allow doctors to perform many types of complex procedures with more precision, flexibility and control than is possible with conventional techniques. However, the next step, one that we might see in 2018 and beyond, will be building AI and machine learning into these surgical robots.

This technology could allow the robot to act as an assistant in surgery rather than just a tool. If we could fit surgical robots with computer vision and image recognition, allowing them to see what the surgeon sees, they could flag the location of sensitive tissue and warn the surgeon before damage is caused.
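As a rough illustration of that kind of assist, and purely a sketch rather than any vendor's API, the check below assumes a hypothetical binary segmentation mask of sensitive tissue and a tracked instrument-tip position, and warns when the tip comes within a safety margin.

```python
import numpy as np

def proximity_warning(tissue_mask, tip_xy, margin_px=20):
    """Warn if the tracked instrument tip is within margin_px pixels of any
    pixel labelled as sensitive tissue in a binary segmentation mask."""
    ys, xs = np.nonzero(tissue_mask)               # tissue pixel coordinates
    if xs.size == 0:
        return False
    distances = np.hypot(xs - tip_xy[0], ys - tip_xy[1])
    return bool(distances.min() <= margin_px)

# Toy frame: a 100x100 image with a tissue blob near the centre.
mask = np.zeros((100, 100), dtype=bool)
mask[45:55, 45:55] = True
print(proximity_warning(mask, tip_xy=(60, 50)))    # True  -> warn the surgeon
print(proximity_warning(mask, tip_xy=(5, 5)))      # False -> no warning
```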

Verb Surgical, the joint venture between Google's Verily and Johnson & Johnson, is already working with physicians around the world on the next surgical robot. The robot will be equipped with machine learning and have access to a cloud of data on thousands of previous surgeries, meaning it could call on that information during a procedure and make suggestions based on similar surgeries performed in the past.
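One simple way to picture "suggestions based on similar surgeries" is nearest-neighbour retrieval over past-case feature vectors. The sketch below is illustrative only: the features, numbers and case labels are invented, and nothing here reflects Verb Surgical's actual design.

```python
import numpy as np

# Hypothetical per-case features: [patient age, BMI, tumour size (mm), prior surgeries]
past_cases = np.array([
    [64, 27.0, 18.0, 1],
    [52, 31.5, 25.0, 0],
    [71, 24.2, 12.0, 2],
    [58, 29.8, 22.0, 1],
], dtype=float)
case_labels = ["case A", "case B", "case C", "case D"]

def most_similar(current_case, k=2):
    """Return the k past cases closest to the current one (z-scored Euclidean)."""
    mu, sd = past_cases.mean(axis=0), past_cases.std(axis=0)
    scaled = (past_cases - mu) / sd
    query = (np.asarray(current_case, dtype=float) - mu) / sd
    order = np.argsort(np.linalg.norm(scaled - query, axis=1))
    return [case_labels[i] for i in order[:k]]

# Surface precedents for a hypothetical case in progress.
print(most_similar([60, 28.5, 20.0, 1]))
```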

Patient Data

Hospital systems collect data on hundreds of thousands of patients each year, but at present any lessons that data holds about population health are lost. Because individuals remain anonymous there are no privacy concerns; the barrier is that vendors' products do not yet give us the technology to mine it.

Last year we had around 140,000 in-patients. If we could mine that data and use machine learning to ask questions of it, it would help us to discover the best medical treatments based on many variables. It could also help hospitals decide where best to position their services. For instance, if we noticed patients coming in for radiation treatment and could spot a trend in the location data, it could lead us to establish a treatment centre in the affected area.
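As a sketch of that location-trend idea only, the example below clusters synthetic, anonymised patient coordinates with k-means; the cluster centres suggest where a satellite treatment centre might sit. All coordinates and counts are made up.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic, anonymised (lat, lon) pairs around three imaginary population centres.
rng = np.random.default_rng(0)
patients = np.vstack([
    rng.normal(loc=[28.5, -81.4], scale=0.05, size=(300, 2)),
    rng.normal(loc=[28.2, -81.6], scale=0.05, size=(150, 2)),
    rng.normal(loc=[28.8, -81.2], scale=0.05, size=(80, 2)),
])

# Cluster the visits; each centre is a candidate location for a satellite service.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(patients)
for centre, count in zip(km.cluster_centers_, np.bincount(km.labels_)):
    print(f"candidate site near {centre.round(3)} serving ~{count} patients")
```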

Our primary focus is as a healthcare provider and we outsource all our technology solutions, so we expect to see a professional-quality product with these capabilities from a vendor very soon.

 

Roger Smith is Chief Technology Officer for Florida Hospital Nicholson Center, Graduate Faculty at the University of Central Florida, and President of Simulation First, where he provides keynote presentations and scientific lectures.
