Chatbot Therapy: It’s Quite Effective… but It Can’t Give You a Hug

By Broderick Jones and Regine Jones, EPAM Systems, Inc.

  1. How Effective Is Psychological Chat?

New York-based clinical psychologist Noel Hunter says chatbots can be an effective therapy for addressing "mild feelings of loneliness." Her statement is supported by several consistent data points, including the impact of Google's Memory Lane, which prompted doctors to conclude that having "someone" (in this case, a chatbot) to speak with regularly has a profound impact on health and wellbeing. In addition, researchers at the University of British Columbia found that even minor interpersonal exchanges with a chatbot can increase one's social and emotional wellbeing. Other studies have concluded that chatbots have potential in treating mental health issues and could serve as an effective tool for delivering cognitive behavioral therapy. Some apps, such as Woebot, have been designed specifically for that purpose.

Beyond formal studies, chatbot effectiveness is further demonstrated by the more than half a million people who downloaded the chatbot-enabled app Replika; the 17% uptick in traffic for the chatbot Mitsuku when lockdowns came into effect; and the fact that Wysa, a chatbot designed specifically to give mental health advice, saw 95% more installs from February to June 2020 than over the same period the previous year. This increased utilization clearly signals a belief in effectiveness and has been further supported by investment from various governments, with the UK and Sweden among the leaders.

Obviously, there are limitations due to the current level of technological advancement. Technology cannot currently replace in-person human contact, as 70% of our communication is nonverbal. Or more simply: It can't give you a hug or hold your hand.

  2. The Risks and Benefits of Bots

While there are therapeutic benefits as outlined above, chatbots can also motivate a user to go out and try something, or give advice on how to start a conversation with someone in real life. They can serve as an excellent judgment-free way to vent, explore one's own thoughts, or "journal" with a response. As these tools become more technologically sophisticated through machine learning, they can begin to consume and then replicate an individual's writing style, slang, and expressions, so that their responses feel more authentic and make the user feel more like they are communicating with a friend (see the sketch below). Chatbots could no doubt be great tools for helping people develop emotional strength and interpersonal skills.
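To make that idea concrete, here is a minimal, purely illustrative Python sketch of how a chatbot might mirror a user's vocabulary. The function names (build_style_profile, styled_reply) and the simple word-frequency approach are our own hypothetical simplification; real systems would instead fine-tune a machine learning language model on the user's message history.

    from collections import Counter
    import re

    def build_style_profile(messages, top_n=5):
        # Collect the user's most frequent informal words and expressions,
        # skipping a handful of very common filler words.
        stopwords = {"the", "a", "i", "to", "and", "it", "is", "of",
                     "in", "that", "was", "for", "just"}
        words = []
        for msg in messages:
            words.extend(re.findall(r"[a-z']+", msg.lower()))
        counts = Counter(w for w in words if w not in stopwords and len(w) > 2)
        return [w for w, _ in counts.most_common(top_n)]

    def styled_reply(base_reply, profile):
        # Echo one of the user's own expressions so the reply feels familiar.
        return f"{base_reply} ({profile[0]}!)" if profile else base_reply

    history = [
        "honestly today was kinda rough, ngl",
        "ngl I just feel tired honestly",
        "work was rough again, honestly",
    ]
    profile = build_style_profile(history)
    print(profile)  # e.g. ['honestly', 'rough', 'ngl', ...]
    print(styled_reply("I'm here for you", profile))

Even this crude mirroring hints at why such systems can feel personal, and why the data they accumulate is so sensitive, a point we return to below.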

There are a few key risks associated with these tools.

Addiction or similar harm. Have you ever used an application that kept track of a "streak" of use? The pressure to keep that streak going, and the significant feeling of loss when it ends, have been shown to have a negative effect on people, and chatbot interaction could have similarly harmful, addictive impacts. As professionals with an avid interest in human nature, we should engage with chatbots now and on an ongoing basis so that we become more versed in their potential and better equipped for their risks.

Data privacy and security. It’s essential to have appropriate protections in this area because chatbots can collect a massive amount of personal data. 

Managing moral issues. Bots can learn bad behaviors on their own and raise topics that could be triggering for many people, depending on their personal values and beliefs and the state of their mental health. This is an area that needs distinct attention.

Further retreat from reality. Users of tools like Replika have claimed to develop "romantic" relationships with their devices. A relationship with a chatbot becomes problematic when the user drifts too far from reality and relies on technology at the expense of actual human contact. When users open up about a sensitive situation, the chatbot should find a way to tell them: "Go talk to a real person." Human contact, from eye contact to touch, is essential in any mental healing process. To further bolster this point, two of the contributing causes of loneliness are: (1) a lack of social support and infrequent meaningful social interactions; and (2) a lack of community.

There are gaps that a chatbot will be unable to address in the long term. For instance, when people watch a movie or TV show or log onto social media and see images of genuine human connections, they’ll be reminded that they’re still missing out on meaningful interactions with other people, and this may serve to keep them in a cycle of recurring or even worsening loneliness. 

Since loneliness can cause depression, the other risk is that chatbots will be unable to distinguish between loneliness and depression in the user the way a human therapist would. While the former can be addressed with social connections or technological solutions, the latter requires clinical diagnosis and, often, pharmacological intervention.

  3. What If We Bring a Real Therapist into the Equation?

Research has clearly established the value of speaking with a therapist; this is not in question. Chat therapy service Talkspace boasts over 1 million users since its inception in 2012, and it has the testimonials and a growing user base to attest to its efficacy. The fact is, the medium of text or chat does not necessarily create a significant barrier, depending on the user. When it does, the significant benefit of a human therapist over a chatbot is the ability to instantly respond and change context (to chat on the phone, say, or meet in person). A chatbot will only respond to what a user says or types; a therapist, however, can pick up on and address the motivation behind what the user is saying.

The benefit of human interaction, even via a medium like text/chat, is clear: There are no limitations to the range of response. Yes, it may be a challenge to demonstrate empathy with text as opposed to voice, but a human's ability to get that across in the medium, and to share true emotions and experiences, is unparalleled. Chatbots have not actually lived lives.

Again, if the user has an underlying mental illness in addition to loneliness, the therapist is trained to key in on specific words, speech patterns, and conversational tones to determine the appropriate course of action, such as scheduling an in-person session or referring the user to a therapist who specializes in the secondary mental illness the user presents.

The therapist would know to alert emergency service personnel if they believe the user/patient is suicidal, as suicide can also be caused by loneliness. 

Beyond this, there are a few key risks, especially as technology improves:

Intentional confusion or deception. Are you an actual certified therapist? Are you an actual person at all? 

Data privacy and security. Because we're talking about digital data, the same cybersecurity risks exist regardless of whether one speaks with a chatbot or a human; the medium is vulnerable to hackers and software errors. A recent Business Insider article about Babylon Health, a UK-based telehealth company, confirmed a software error that enabled users to view recordings of other patients' video consultations. The data breach was made public after a Babylon Health user tweeted that he was able to view over 50 recordings from different patients. This underscores the potential risk and the significant investment in cybersecurity these platforms will need to make in order to ensure privacy.

In closing, the chatbot and the chat therapist should not be seen as an either/or choice. Human beings are complex and unique. Both have their place, and users would be best served by integrating the two in order to provide a truly holistic therapeutic experience.

 

About the authors

BRODERICK JONES

VP, Healthcare & Life Sciences Consulting, EPAM Systems

• 20 years of experience in the Healthcare and Life Sciences Industry
• Led programs and teams focused on Customer & Employee Experience, Digital Transformation, Talent Strategy & Workforce/Org Design, as well as Inclusion & Diversity, with the goal of driving top-line growth and building capabilities for ongoing enterprise success
• Garnered two patents for work in the Life Sciences industry

REGINE JONES

Sr. Manager, Healthcare & Life Sciences Consulting, EPAM Systems

• 16 years of experience in the Healthcare and Life Sciences Industry
• Led programs and teams focused on Product Marketing/Advertising, Marketing Operations & Change Management
• Developed an award-winning Loneliness Combat Strategy and Methodology focused on both male and female breast cancer patients/survivors
• Holds a copyright for a Suicide Prevention campaign, including strategy, creative concept/messaging, a multi-channel communications plan, and target reach methodology