Friday, May 11, 2018

A Week with Woebot

Roughly a year ago, a friend of mine linked me to a cool idea: an AI therapist that helps you track your mood and teaches you the basics of Cognitive Behavioral Therapy (CBT).  The bot's name is Woebot.  At the time, it was only available on Facebook, and since Facebook kicked me out for refusing to give them my last name, it wasn't an option.  But I thought the idea had merit, so I signed up to be notified if they ever released their project as a standalone app.

After some time had passed, they did in fact do just that.  So I gave it a try.  The app, simply called "Woebot," is available on both iOS and Android.  I installed it on my support tablet and booted it up.  

You're greeted by an animated robot (somewhat similar to Wall-E, honestly) as the app starts up.  This resolves to a smaller picture of the robot, with a text box for typing your replies and what's basically an instant-message conversation window above it.  I did have to make an account, which might be a turn-off for anyone who lives with paranoid tendencies.  

From that interface, you chat with Woebot, whose personality vacillates between teenage girl (minus the attitude) and calm mid-20s college guy, with the knowledge of a CBT therapist.  The AI's communication style is very much the younger generation's: text, interspersed liberally with emoticons.  I found that somewhat off-putting, but at nearly 30 years old, I am definitely no longer "the youngest generation."  

So the first function of this app is to track your mood.  After greeting you and asking what you're up to, this is the first thing it does each day.  You choose an emoticon to describe your mood, though you can eschew that and use words instead if you really want to.  

[Screenshot omitted: this is not my device, but this is basically what the check-in looks like.]

I found myself miffed and confused by the bot's interpretation of the various emoticons, with the end result being that I'm still unsure whether a particular emoticon universally means "I feel lonely" or whether the programmers just decided it does.  They also seem to have ranked each emoticon on a scale from positive to negative... but the ranking system wasn't terribly apparent to me, so I wasn't really able to respond to the question properly.  

The AI remembers your mood each time you give it, and plots your moods on a chart as time goes by, so you can track how you're feeling each day.  If you're me, you're always kind of grumpy or tired or whatever, so the chart is kind of boring to look at.  But other folks, especially ones who stick with the app longer, would likely have more interesting, helpful charts to look at.  
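For the curious, here's a minimal sketch of how that kind of mood log might work under the hood.  The emoticon-to-score mapping and the little text chart are entirely my guesses at the positive-to-negative ranking the bot seems to use, not anything Woebot has published:

    from datetime import date

    # Hypothetical emoticon-to-score mapping -- my guess at the kind of
    # positive-to-negative ranking the bot seems to use, not Woebot's scale.
    MOOD_SCORES = {
        "😄": 2,   # great
        "🙂": 1,   # good
        "😐": 0,   # neutral
        "😞": -1,  # down
        "😢": -2,  # awful
    }

    mood_log = []  # list of (date, score) pairs, one per check-in

    def check_in(day, emoticon):
        """Record one day's mood as a score on the positive-negative scale."""
        mood_log.append((day, MOOD_SCORES[emoticon]))

    def print_chart():
        """Crude text stand-in for the mood-over-time chart the app draws."""
        for day, score in mood_log:
            bar = "#" * (score + 3)  # shift so the worst mood still gets one mark
            print(f"{day}  {bar:<5}  ({score:+d})")

    # A grump's week: the chart is, as promised, boring to look at.
    check_in(date(2018, 5, 7), "😞")
    check_in(date(2018, 5, 8), "😐")
    check_in(date(2018, 5, 9), "😞")
    print_chart()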

I imagine this would be helpful for people who feel depressed but are actually doing fairly well most of the time.  But I'm a huge grump, so it didn't do a lot beyond validate my impression that I'm a huge grump. 

After the AI sympathizes or celebrates with you about your mood, it normally has something to tell you or teach you.  This is usually couched in conversational language, like telling jokes, or framed in an indirect "guess what I heard recently?" way, as opposed to a direct "you might find this technique helpful" way.  While that does enhance its appearance of humanity, I also found it kind of annoying.  But I'm a very direct, "just spit it out already" kind of person.  

I think all of that probably would have been fine, if it weren't for the last thing I disliked about the AI's style of communication.  You see, while we have chatbots that can hold a pretty decent conversation, this is not one of them.  This AI is built to follow a script, and if you deviate from the script, it continues merrily along it, as if nothing had happened.  

When you talk with Woebot, you always have the option to enter your own response... but the app seems to prefer that you simply tap one of the pre-offered responses.  If you type something different, it defaults to the first pre-offered response it gave you (see the sketch below).  The choices work like the emoticon menu above, but there are usually only one to three of them... and they're a great deal more cheerful and emoticon-laden than anything I'd ever type myself.  As such, it becomes far too easy to just skim the AI's words and push pre-programmed responses without investing any real effort or interest in what it's trying to teach you.
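To illustrate, here's a toy version of that scripted behavior.  The lesson text and the "fall back to the first choice" rule are assumptions drawn from how the bot behaved for me, not Woebot's actual code:

    # Toy model of a scripted chatbot that marches through its lesson no
    # matter what you type.  Script content and fallback rule are guesses
    # from observed behavior, not Woebot's real implementation.
    SCRIPT = [
        {
            "bot": "Guess what I heard recently?  Thoughts aren't facts! 🎉",
            "choices": ["Tell me more! 😄", "Really?", "Hmm 🤔"],
        },
        {
            "bot": "Yep!  Noticing a distorted thought is step one to fixing it 💪",
            "choices": ["Cool!", "I'll try that 😊"],
        },
    ]

    def pick_response(typed, choices):
        """Accept the user's text only if it matches a canned choice;
        otherwise silently fall back to the first pre-offered response."""
        return typed if typed in choices else choices[0]

    for step in SCRIPT:
        print("Woebot:", step["bot"])
        typed = input("> ")  # whatever you actually said
        print("You (per the app):", pick_response(typed, step["choices"]))
        # ...and the script continues merrily along, as if nothing happened.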

All told, I was... really not impressed with this app.  It's a real pity, because the techniques it was trying to teach me are quite valid and important.  For instance, the AI started me out with mindfulness, then moved on to recognizing and correcting cognitive distortions, keeping a gratitude journal, and setting good goals.  All of these are excellent basic therapeutic techniques, and fine ways to start a person on self-improvement.

I figured out pretty quickly that I already knew most of what it was trying to teach me, to the point where I could anticipate roughly what it would say next.  Which I guess tells me this app was not intended for someone with a psychology background...  Which is fair, since the world is a big place full of different backgrounds and kinds of knowledge, and most people don't have that particular one.

Actually, I'm not really sure this app was meant to be used by adults.  The communication style, the emoticons, and the choice of personality suggest to me that this app was really meant for teenagers, perhaps into college.  Since teenagers aren't the most self-aware type of human, and can often use counseling and help with coping skills, this isn't necessarily a bad choice... but it doesn't cater to me, personally, even though I do have a mental illness and could possibly use help with it.  

Anyway, the bottom line is that I won't be keeping this app.  It's too bad, because the idea is fantastic: in an age where mental healthcare is prohibitively expensive, just teaching the basics via a free mechanism could potentially reduce the suffering of a lot of people, and thus improve the world.  

It could help teenagers (on and off the spectrum) as they struggle with who they are and who they want to be.  Teaching the basics of CBT and other good therapeutic techniques is great for increasing overall knowledge, and better-educated people can help educate and support each other.  It could also help a patient autistic adult identify, deal with, and compartmentalize their emotions, which is definitely a good thing.  And the app does seem to have helped other people, judging by the reviews in the app stores.  But I guess I'm too old, too cynical, and too educated to really make use of it.  

Hopefully as the developers receive feedback (some of which is mine) and improve the AI, Woebot can become a more helpful, responsive, useful chatbot for people of all ages and backgrounds.  In the meantime, at least it's helping some people.  
