SINGAPORE: Chatbots can provide some comfort to people with mental health issues, but they fall short in detecting suicidal tendencies and in offering appropriate help in crisis situations, researchers have found.
A study of nine mental health chatbots by Nanyang Technological University (NTU) showed that while they empathise with users in conversations, they were unable to recognise users expressing suicidal tendencies or to offer personalised advice.
Chatbots, or computer programmes that simulate human conversation, are increasingly used in healthcare to manage mental health conditions or support general well-being.
USING CHATBOTS TO OFFER TIMELY CARE, SUPPORT WELL-BEING
The use of chatbots comes at a time when people are more aware of their mental wellness.
“I think it’s important and probably COVID-19 was good to kind of bring mental health a bit more into the open and to really say to people that it is fine if they don’t feel well and they can talk about these things,” said Dr Laura Martinengo, a research fellow from NTU’s Lee Kong Chian School of Medicine.
“But also, we know that health professionals are not enough. So we need other ways to treat a larger amount of the population.”
Chatbots are especially useful as healthcare systems around the world are stretched and struggling to cope with rising demand for their services, said observers. Those who feel stigmatised may also be more willing to open up to a machine than to another person.
“Stigma is a big problem. I think when you don’t feel well, probably even hearing it from a machine helps,” Dr Martinengo told CNA on Tuesday (Dec 20).
“Also, sometimes, it’s very difficult for people with mental health disorders to actually talk about these things, and to tell people they don’t feel well.”
Some of the chatbots allow users to type in their feelings, while others guide them through a list of options.
Dr Martinengo said that, based on their user interfaces and responses, these chatbots appear to be geared towards a younger population.
“They will use words like buddy or WhatsApp, or language that probably the younger people use. So (the young) seem to be the target user group,” she added.
“They are able to ask for your name and obviously the system will remember your name, but there are not many other ways that the chatbots personalise the conversation.”