Abstract

Decision Support for Cost-Effective Diagnosis and Treatment by Inverting Bayesian Probability

Author: Gerald Loeb

Efficient differential diagnosis requires intelligent decisions about which data to obtain next in light of the expected utility and the associated costs in terms of money, delay and risk. Given a database such as electronic health records that associates definitive diagnoses with the tests performed to reach them in a large population of patients, it is possible to rank-order the expected diagnostic benefit of any given test at any given point in any patient’s work-up. We previously developed a mathematical algorithm to optimize similar decisions based on an inversion of Bayesian decision-making and applied it with unprecedented success to the constrained but difficult problem of identifying one of a large number of objects from exploratory movements and tactile information (Fishel and Loeb, Frontiers in Neurorobotics, 2012, doi:10.3389/fnbot.2012.00004; Loeb and Fishel, Bayesian Action&Perception: Representing the World in the Brain, Frontiers in Neuroscience, 2014, doi:10.3389/fnins.2014.00341). Intelligent decisions about which exploratory movement to perform next were based on confusion matrices among all of the objects, weighted by the current probabilities of the hypotheses regarding the unknown object’s identity. This approach appears to have overcome the usual “curse of dimensionality” that encumbers the training of neural networks to discriminate objects with many unrelated attributes. Bayesian Action&Perception is very similar to differential diagnosis. At each decision point in the medical diagnostic process, the algorithm can rank-order the available diagnostic tests according to the likelihood that they will be discriminative, as estimated from the database and weighted by the current diagnostic probabilities. A Markov chain can be constructed to weight the aggregate costs by the probability of their being incurred along the various paths to a definitive diagnosis. By defining wellness as a diagnostic state, the same algorithm can be used to tailor lifestyle and follow-up recommendations that are likely to achieve and maintain that state for individual patients. By defining treatment outcomes as diagnostic states (e.g., cancer in remission, pneumonia resolved), the same algorithm can be used to rank-order the cost-effectiveness of the available treatment options. By including genetic marker information from a large population of patients, the same algorithm can provide a form of personalized medicine without waiting for specifically designed, controlled trials. Our clinical decision support system is NOT based on expert opinion, fundamental pathophysiology or meta-analyses of journal articles, all of which are prone to disagreement and obsolescence. It uses only the database of clinical experience in the electronic health records, which is continually growing. As new diagnoses, tests or treatments come into clinical use, the database will automatically include those experiences, and the algorithm will automatically consider them according to the statistical power that the cumulative experience has achieved. Large and persistent deviations of practice from the recommendations of the algorithm can be used to identify excessive use of expensive, risky or obsolete procedures. Full implementation of this algorithm will require, but will also motivate, improvements in the quality and completeness of electronic health records.
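
For concreteness, the following is a minimal sketch (in Python) of how the test-ranking step could be computed under the assumptions of this abstract: each candidate test is scored by its expected reduction in diagnostic uncertainty, averaged over the outcomes predicted by the current hypothesis probabilities and divided by a single aggregate cost. The function names, the entropy-based score and the example diagnoses, outcomes and costs are illustrative assumptions, not the published implementation.

    # Sketch of ranking candidate tests by expected information gain per unit cost,
    # assuming the EHR database has been summarized into outcome likelihoods
    # P(result | diagnosis) for each test. All names and numbers are hypothetical.

    import math

    def entropy(probs):
        """Shannon entropy (bits) of a discrete probability distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0.0)

    def expected_information_gain(prior, likelihoods):
        """
        prior:       dict {diagnosis: current probability}
        likelihoods: dict {diagnosis: {test_result: P(result | diagnosis)}}
        Returns the expected reduction in diagnostic entropy if the test is run,
        averaging over the results predicted by the current hypotheses.
        """
        h_before = entropy(prior.values())
        results = {r for dist in likelihoods.values() for r in dist}
        gain = 0.0
        for r in results:
            # Probability of observing result r under the current mixture of hypotheses.
            p_r = sum(prior[d] * likelihoods[d].get(r, 0.0) for d in prior)
            if p_r == 0.0:
                continue
            # Posterior over diagnoses if result r were observed (Bayes' rule).
            posterior = [prior[d] * likelihoods[d].get(r, 0.0) / p_r for d in prior]
            gain += p_r * (h_before - entropy(posterior))
        return gain

    def rank_tests(prior, tests):
        """
        tests: dict {test_name: (likelihoods, cost)}; cost folds money, delay and
        risk onto one scale (an assumption of this sketch).
        Returns tests sorted by expected information gain per unit cost.
        """
        scored = {name: expected_information_gain(prior, lk) / cost
                  for name, (lk, cost) in tests.items()}
        return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

    # Hypothetical two-diagnosis, two-test example.
    prior = {"pneumonia": 0.6, "bronchitis": 0.4}
    tests = {
        "chest_xray": ({"pneumonia": {"infiltrate": 0.85, "clear": 0.15},
                        "bronchitis": {"infiltrate": 0.10, "clear": 0.90}}, 2.0),
        "cbc":        ({"pneumonia": {"high_wbc": 0.70, "normal": 0.30},
                        "bronchitis": {"high_wbc": 0.45, "normal": 0.55}}, 1.0),
    }
    print(rank_tests(prior, tests))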
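The Markov-chain cost aggregation mentioned above can likewise be sketched as a small fixed-point computation over work-up states, in which definitive diagnoses are absorbing states and each transition carries a probability and a cost estimated from the database. The state names, probabilities and dollar figures below are hypothetical placeholders, not data from any actual record system.

    # Sketch of expected aggregate cost of a diagnostic work-up modeled as a
    # Markov chain. Expected cost from each state satisfies
    #     E[s] = sum_i p_i * (cost_i + E[next_i]),
    # with absorbing states (definitive diagnoses) contributing no further cost.

    def expected_costs(transitions, absorbing, iterations=1000):
        """
        transitions: dict {state: [(next_state, probability, cost), ...]}
        absorbing:   set of states with no further cost (definitive diagnoses).
        Returns dict {state: expected total cost to reach an absorbing state},
        computed by fixed-point iteration.
        """
        states = set(transitions) | set(absorbing)
        states |= {nxt for edges in transitions.values() for nxt, _, _ in edges}
        cost = {s: 0.0 for s in states}
        for _ in range(iterations):
            for s, edges in transitions.items():
                if s in absorbing:
                    continue
                cost[s] = sum(p * (c + cost[nxt]) for nxt, p, c in edges)
        return cost

    # Hypothetical work-up: initial visit -> cheap test -> either a definitive
    # diagnosis or escalation to an expensive test that always resolves.
    transitions = {
        "initial_visit":  [("cheap_test", 1.0, 100.0)],
        "cheap_test":     [("diagnosed", 0.7, 50.0),
                           ("expensive_test", 0.3, 50.0)],
        "expensive_test": [("diagnosed", 1.0, 1200.0)],
    }
    print(expected_costs(transitions, absorbing={"diagnosed"}))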

Co-Author/Co-Investigator Names/Professional Title: Gerald E. Loeb, M.D., Professor of Biomedical Engineering, University of Southern California; Jeremy A. Fishel, Ph.D., Chief Technology Officer, SynTouch Inc.

Funding Acknowledgement (If Applicable): Biomed Concepts, Inc.