A human relationship is based on trust and, in some ways, bravery. The choice to come forward to talk about our fears, our struggles, our tragedies can be a difficult one; human communication demands that we reach out to and confide in another person. Making that initial connection and displaying that vulnerability can be frightening, especially for those who struggle with mental health concerns. It can feel easier – at least in the short term – to keep our internal battles to ourselves.
But what if we didn’t have to express vulnerability in front of another person to find an empathetic listener? Sometime in the past few years, the future slipped into our lives and made itself at home; today, AI chatbots offer those in distress a chance to talk through their feelings without fear of judgement or censure. Woebot, a subscription-based mental health chatbot that vaulted to prominence in 2017, is designed to build relationships with users by checking in daily about their emotional state and mood. These empathetic bots are accessible, easy to use, and completely anonymous; all are engineered to mirror human conversation patterns, provide counsel and companionship, and guide their users towards positive decisions via tactics gleaned from cognitive behavioral therapy (CBT).
Does this sound like the well-meaning start of an episode from the sci-fi anthology series Black Mirror? Maybe – but no one can deny that chatbots can have a positive impact on those they communicate with.
In 2017, Stanford partnered with Woebot Labs LLC to conduct a controlled study on the app’s efficacy. In the study, scientists observed two populations: one that utilized Woebot for two weeks, and another that had access to a self-help book on college students and depression. The study had 70 participants in total, all between the ages of 18 and 28. By its end, the researchers found that the depressive symptoms in those who used Woebot lessened significantly in comparison to the control group. Of course, this is only one study and more will need to be done to assess Woebot’s capabilities in helping alleviate depression – but for now, the results are promising. Investors certainly think so; in early May of 2018, Woebot Labs garnered $8 million in Series A funding for the project.
Their enthusiasm is understandable; after all, Woebot stands to do a lot of good for mental healthcare. According to the World Health Organization, over 300 million people suffer from depression globally, and many lack the means or access to regular mental healthcare services. Some don’t have the out-of-pocket funds they would need to pay for treatment, while others live in rural areas that lack certified professionals. Perhaps most damningly, the stigma against mental health care is still pervasive enough that even if someone could receive care, they might not feel comfortable doing so. Apps like Woebot offer a comparably affordable and anonymous option for those who aren’t in a position to seek in-person therapy on a regular basis.
Woebot seems great. However, I worry that in the excitement of welcoming mental health chatbots, we might be overlooking some of the problems they pose. If you search for articles about Woebot online, you’ll probably find variations on the same title:
The robot will see you now!
Meet your virtual therapist: Woebot is the next big thing in mental healthcare.
Titles like these – or even the one at the top of this piece! – imply that apps like Woebot are a replacement for human mental health professionals, rather than a complement to traditional therapy. This is a rather important point. As one writer for Wired puts it in an article on the subject: “Woebot is obviously not a licensed physician, and it doesn’t make diagnoses or write scrips. It’s not equipped to deal with real mental health crises either.” The bot is programmed to offer hotline numbers and human-centered resources whenever it perceives a situation to be critical. Another not-unimportant note: unlike sessions with a human therapist, conversations with a chatbot are not covered by doctor-patient confidentiality, and could thus pose privacy concerns for users.
I am optimistic about chatbots. Apps like Woebot can help improve access to basic CBT counseling, provide a sense of companionship, and even alleviate some symptoms of depression. That said, I do still worry that the users who fall back on Woebot will see the bot as their only means of mental health counseling because it allows them to keep away from the stigma surrounding in-person care. The task of treating underlying mental health concerns will ultimately need to be handled by licensed (human) professionals. Woebot’s accessibility is a great development for those who struggle with mental health concerns – but I believe that its existence should motivate us to do even more to lessen the stigma around mental health services, so that those who aren’t able to cope with just the app can seek the help they need without shame or fear. In the end, human connection does and will always make the difference.