Dialog: the First Step towards an AI Doctor

Author: Vishal Nangalia

Imagine the near future, where your first port of call for medical advice is a conversation with a virtual doctor: an Artificial Intelligence (AI) available on your phone, your Amazon Echo, or even your smart TV. Steps are already being taken to move from the current state of one-way web searches to very basic chatbots. However, none of these 'solutions' comes close to what a conversation with a doctor is like. The problem is that current virtual assistants are based on analyses of the curated content of existing web pages (such as NHS Choices or WebMD), so interactive AI built on them produces a dialogue that is not only completely different from a real-world interaction but also largely inaccurate in identifying a patient's problem, suggesting a diagnosis, and ultimately recommending the next steps for their treatment.

The solution proposed here is to leverage two existing technologies, voice transcription and natural language processing (NLP), and couple them with the curation of the most extensive dataset of real-world doctor-patient interactions. In addition, the first version of this system will alleviate two immediate pain points, one for patients and another for doctors. Dialog is a HIPAA-compliant app that:
1. Records the conversation between the patient and their healthcare professional.
2. Allows the doctor to a) summarise the conversation and immediately incorporate this summary into the electronic health record, b) share the summary as part of a referral to another healthcare professional, and c) share it with the patient as well.

The recording of the dialogue will be encoded and hashed such that no changes can be made to the raw data, allowing its use as an indisputable record. The summary will be transcribed using the most accurate cloud-based voice transcription services available. In addition, NLP applied to the summary will suggest ICD codes, so that automated billing can be deployed.
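The proposal does not specify the hashing scheme for the recording. A minimal sketch of one plausible approach is a hash chain over the audio stream, where each chunk's digest folds in the previous digest, so altering any earlier chunk changes the final fingerprint. The function name and chunk values below are illustrative, not part of the proposed system:

```python
import hashlib

def chain_hash(chunks):
    """Hash-chain audio chunks: each digest covers the previous digest
    plus the next chunk, so any later edit to the raw data invalidates
    the final fingerprint stored alongside the recording."""
    digest = b""
    for chunk in chunks:
        digest = hashlib.sha256(digest + chunk).digest()
    return digest.hex()

# Tampering with any chunk yields a different final fingerprint.
original = [b"audio-chunk-1", b"audio-chunk-2", b"audio-chunk-3"]
tampered = [b"audio-chunk-1", b"audio-chunk-X", b"audio-chunk-3"]
print(chain_hash(original) != chain_hash(tampered))
```

In practice the fingerprint would be timestamped and stored separately from the audio (or anchored in an external log) so that neither party can silently rewrite the record.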
The advantages for the doctor are clear: automatically transcribed and incorporated summaries with ICD codes will save time and increase revenue. Meanwhile, the patient gets an immediate record of the interaction that they can review repeatedly at home and share with family and trusted friends; this is particularly important for elderly patients who have difficulty recalling and understanding their doctor's advice and instructions. As the dataset of both the full dialogues and the curated summaries of the interactions grows, advanced NLP can be deployed to create a semantic engine that understands the conversation in real time, initially generating the summary automatically but eventually becoming a full-fledged virtual doctor. This could be a standalone service or a semantic engine that powers third-party health offerings.
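The proposal leaves the ICD suggestion mechanism open. As a toy illustration only, the first iteration could be as simple as keyword lookup over the summary text; the three ICD-10 entries below are a hypothetical fragment of the full code set, and a production system would use a trained NLP model rather than exact matching:

```python
# Hypothetical, tiny ICD-10 lookup table for illustration; the real
# system would cover the full ICD-10 code set via a trained model.
ICD10_KEYWORDS = {
    "hypertension": "I10",
    "asthma": "J45",
    "migraine": "G43",
}

def suggest_icd_codes(summary):
    """Return candidate ICD-10 codes whose keyword appears in the summary."""
    text = summary.lower()
    return sorted(code for kw, code in ICD10_KEYWORDS.items() if kw in text)

print(suggest_icd_codes("Patient reports migraine episodes; history of asthma."))
```

Even this naive version shows the intended billing workflow: the doctor reviews and confirms suggested codes rather than entering them from scratch.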

Co-Author/Co-Investigator Name/Professional Title: David Meinhart