An electronic fly on the clinical wall: voice-recognition AI in medicine
=========================================================================

* Tim Lougheed

Like many adult children of octogenarian parents, Edmonton entrepreneur Dave Damer takes every opportunity to accompany them on visits to the doctor. He isn’t trying to interfere with their care but does want to ensure they understand what they are told, especially if it’s something that demands a particular action.

That goal is reflected in the product being developed by Damer’s company, Testfire Labs. The small start-up has big plans for using artificial intelligence (AI) to change the way we approach any sort of meeting, from the largest corporate gathering to the most intimate doctor–patient consult. The technology could keep track of clinical encounters and provide a documented review afterward. Best of all, patients would have a record to help them appreciate the importance of their appointment.

“Being able to walk away with something written from the doctor’s appointment, something that summarizes ‘here are the issues we talked about; here are the medications we talked about’ — this is well within the realm of feasibility,” he said. “It could be ready immediately after the appointment, so people have it in their hands.”

Testfire’s system, called Hendrix.ai, can be dialed into the proceedings on a phone like any remote participant in a meeting. It can provide a complete transcript of what everyone said, as well as identify key points and action items, and eventually it will provide an analysis of the emotional state expressed by various individuals. It may sound like science fiction, but Damer and his colleagues have assembled their product from software that is already being employed around the world by high-tech giants like Amazon and Google.

![Figure 1](https://www.cmaj.ca/content/cmaj/191/21/E591/F1.medium.gif)

Voice-recognition software could enable patients to have a record of everything discussed at medical appointments. Image courtesy of [iStock.com/Eva-Katalin](http://iStock.com/Eva-Katalin)

The idea of inviting an invisible third party into a clinical setting may be unsettling for some. Many have questioned the implications of consumer AI devices such as Amazon’s Alexa, which listen in as we go about life in our homes. Yet champions of the technology, such as American cardiologist Dr. Eric Topol, have countered this response by showing how it can relieve medical staff of administrative burdens, such as note-taking or tracking prescriptions, allowing doctors and nurses to spend more time with patients. He painted such a picture for *CMAJ* in 2013 and in a 2018 report to the UK National Health Service, suggesting that AI will provide productivity gains that save money and promote better communication.

However, as anyone who has used voice-transcription software knows, it isn’t always ready for prime time. Words are regularly missed or garbled, especially if they are technical in nature, as will often be the case in medical contexts. Developers of current products promise to resolve this problem with machine learning, building a robust and accurate lexicon by reviewing huge numbers of examples of specialized spoken language. Beyond the prospect of mechanical accuracy, though, is the potential for even more tantalizing gains, such as the ability to enlist AI in the diagnosis of a patient’s condition.
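How might a specialized lexicon help? As a minimal sketch, assume a transcription engine has already produced text and we simply post-correct out-of-vocabulary words against a small medical word list using fuzzy matching. Everything here (the word list, the matching method, the threshold) is a hypothetical illustration, not how Hendrix.ai or any commercial product actually works.

```python
# Illustrative sketch only: post-correcting a garbled transcript against a
# small, hypothetical medical lexicon with simple fuzzy matching from the
# Python standard library.
import difflib

# Toy lexicon; a real system would learn a far larger vocabulary from
# large volumes of specialized spoken language.
MEDICAL_LEXICON = {"metoprolol", "warfarin", "hypertension", "atrial", "fibrillation"}

def correct_transcript(words, cutoff=0.8):
    """Replace out-of-lexicon words with their closest lexicon match, if close enough."""
    corrected = []
    for word in words:
        token = word.lower()
        if token in MEDICAL_LEXICON:
            corrected.append(word)
            continue
        match = difflib.get_close_matches(token, MEDICAL_LEXICON, n=1, cutoff=cutoff)
        corrected.append(match[0] if match else word)
    return corrected

# Example: the transcription engine mishears a drug name and a condition.
print(correct_transcript(["start", "metroprolol", "for", "hypertention"]))
# -> ['start', 'metoprolol', 'for', 'hypertension']
```

Hand-curating such a list would not scale, which is why developers describe building these vocabularies automatically from huge numbers of examples of specialized spoken language.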
“We can use machine learning to pick out differences in speech that indicate changes to the brain related to dementia,” said Frank Rudzicz, a computer science researcher with the Vector Institute for Artificial Intelligence in Toronto. “We now have about 2000 separate measures, from differences in pitch to counting the number of nouns.”

But even AI advocates, such as computer science professor Eleni Stroulia of the University of Alberta, are aware of the impact such technology will have on medical practice.

“We have to be aware of the fact that we are changing what happens and what this might imply about the diagnostic or therapeutic process,” she said. “It’s one thing to talk with your doctor and the doctor scratches a few things here and there; it’s another thing to know that exactly everything I say is being transcribed. Will I talk the same way?”

## Footnotes

* Posted on [cmajnews.com](http://cmajnews.com) on May 8, 2019.