Link to article in KevinMD.com
Will bots replace docs?
DAVID KERR, MD | TECH | AUGUST 29, 2017
In days of yore, people in distress usually turned to family, friends and occasionally the clergy for help, but for many years now much of the burden of dealing with human misery has been transferred to medical professionals. Going forward, given that the iPhone and social media are ubiquitous in the daily lives of most of us, it is probably no surprise that the latest psychological support system comes in the form of a chatbot.
For a modest fee and the use of Facebook Messenger, individuals with anxiety or depression can now access the wonderfully named Woebot — a chatbot “you can tell anything to” that uses texts and emojis to provide self-help based on cognitive behavior therapy principles. Woebot doesn’t suggest medications but instead “functions as a friend” and, unusually for this type of technology, actually has some positive data behind it, showing improvements in depressive symptoms earlier than anticipated.
Meanwhile, another company has created an artificial intelligence bot that claims to understand emotions. According to its creator, Emotibot will take your side after an argument with a girlfriend or boyfriend “by throwing angry face emoticons to your chat interface and use sarcastic tones such as, ‘Oh, I’m not surprised he or she did that.’”
For these types of technological innovations in the chatbot space, a major challenge is understanding natural language and the impact of illness per se on the choice, understanding and interpretation of words. For example, there are at least 10 different ways to describe being unwell in the UK — including feeling off-color, iffy, gippy and poxy or having the lurgy — each with different nuances of meaning. Such phrases are also likely to be even more impenetrable to a robot if said with a Geordie, Scouse or Glaswegian accent. As front-line clinicians know very well, a word or phrase can have different meanings depending on the “body language” that accompanies it, although this may not be an insurmountable barrier, as machine learning is also being applied to real-time images. However, with repeated conversations there may be concerns about the psychological impact of this type of long-term human-machine interaction; apparently, some people have gone as far as to profess their undying love and propose marriage to their chatbots.
Clinicians also appreciate that simply providing information is rarely sufficient to impact the emotional burden of severe or long-term illness and that the presence of illness per se can influence understanding and interpretation of spoken or written information.
Creators and funders of artificial intelligence robotics, as well as futurists, need to appreciate that illness does impact the human condition in ways that can be illogical as well as unpredictable.
Even more importantly, clinical care is still more art than science, and although robots can be built to simulate sincerity, without empathy their value is markedly limited.
Given the growth in demand and the cost of employing humans, chatbots could certainly fill the gaps between visits to the GP surgery or hospital clinic. Moreover, having your own personal (robotic) health care assistant on call 24 hours a day could be helpful for medication reminders, arranging appointments, interpreting blood test results (within limits), promoting healthier lifestyles and minimizing the impact of trivial acute illness. However, making life-changing diagnostic and therapeutic decisions needs validation from qualified human beings.
The introduction of personal health robots is unstoppable, but their actual role is far from clear: at what point in a diagnostic and therapeutic pathway should a human take over? It is also very likely that regulators will want a piece of this particular robotic “action.” One wonders how long it will take for medically “qualified” robots to be required to undergo appraisal and revalidation, and whether we will eventually need a robotic version of the General Medical Council to deal with robotic misdemeanors.
David Kerr is an endocrinologist.