Riya Bhatnagar*, a 21-year-old journalist in Gurgaon, first started talking to Wysa when she was in college. “I felt I was mildly depressed and what I really wanted was to clear my mind,” she said. “You know, sometimes it’s far easier to just have a sounding board that won’t judge you. That’s what Wysa was for me.”
Wysa is not a person – it is a chatbot, a computer application that uses artificial intelligence to communicate with people. Created by a health technology start-up called Touchkin, Wysa is based on cognitive behavioral therapy techniques. Its programme was written by a team of therapists, artificial intelligence specialists, user experience designers and developers.
Cognitive behavioral therapy is a form of talking therapy in which a psychologist – or in this case a chatbot – guides a patient through their thoughts, feelings, physical sensations and actions to help them break seemingly overwhelming problems into smaller parts that are easier to tackle. This kind of therapy is used to treat problems like anxiety, depression, stress and loneliness.
Like other therapy chatbots, Wysa does not offer a diagnosis. Nor is it touted as a replacement for a psychotherapist. What it is set up to do is listen to a user with empathy and provide support without judgment. Available for Android and iOS devices, the app ensures anonymity since it does not require users to enter personal details.
Wysa uses an artificial intelligence system to identify thoughts of self-harm or suicide from patterns in its conversations with users. The Wysa team also manually reviews the system’s performance by scanning conversations at random. If the app detects that a user is having thoughts of self-harm or suicide, it provides them with helpline numbers for immediate crisis intervention and support.
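Wysa has not published the inner workings of this system, but the general pattern described here – flag a risky message, sample conversations for human audit, and surface helplines on a match – can be sketched roughly. The Python snippet below is a hypothetical illustration only: the phrase list, sampling rate and placeholder helpline entry are invented stand-ins for what would, in practice, be a trained machine-learning model and a vetted, localised resource directory.

```python
import random

# Hypothetical stand-in for a trained self-harm classifier. A real system
# would use a machine learning model, not a fixed phrase list.
RISK_PHRASES = ["hurt myself", "end my life", "kill myself", "no reason to live"]

# Placeholder entry – a deployment would list vetted, current, local helplines.
CRISIS_RESOURCES = ["<verified local crisis helpline number>"]

# Fraction of all conversations sampled for manual review, mirroring the
# random spot-checks the Wysa team is described as performing.
REVIEW_SAMPLE_RATE = 0.01


def assess_message(message: str) -> dict:
    """Flag a message for possible self-harm risk and decide on escalation."""
    text = message.lower()
    flagged = any(phrase in text for phrase in RISK_PHRASES)
    # Flagged messages always go to a human; the rest are randomly sampled.
    needs_human_review = flagged or random.random() < REVIEW_SAMPLE_RATE
    return {
        "flagged": flagged,
        "needs_human_review": needs_human_review,
        "helplines": CRISIS_RESOURCES if flagged else [],
    }


if __name__ == "__main__":
    result = assess_message("Some days I feel there is no reason to live")
    if result["flagged"]:
        print("Please reach out for immediate support:")
        for helpline in result["helplines"]:
            print(" ", helpline)
```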
Bhatnagar felt she needed help but did not have the time to visit a therapist after work. She also did not want to worry her family. So, she turned to Wysa three or four times a day, using it for ten to fifteen minutes at a stretch. She set up her Wysa account to check in on her through routine push notifications with messages like “Hey, just wanted to check in,” or “Hello, how are you doing?”.
“The fact that it checked in on me periodically really helped,” she said. “It taught me how to cope with my thoughts with breathing exercises. It gave me a medium to type out exactly how I was feeling – it helped me vent. Also, the fact that it was a bot and not a person with actual feelings, helped me say stuff out loud a lot. Moreover, just the constant feeling of knowing that I was not alone, really helped.”
Using Wysa paid off – after two months, Bhatnagar began to feel calmer and more confident. She still has the app on her phone, even though she no longer feels the need to use it.
Cost-effective option
Chatbots have become increasingly popular in India. For some users, like Bengaluru resident Anant Sharma*, it is about cost. A consultation with a psychologist can cost anything between Rs 800 and Rs 2,000 per session, and in India a course of therapy for moderate to severe depression lasts an average of eight to 10 sessions. On the other hand, chatbots like Wysa and Woebot, which Sharma uses, are free to download and use.
“I have been to therapy in the past but I cannot afford it right now as I am between jobs,” said 32-year-old Sharma, who has been diagnosed with anxiety. “Someone in my Facebook support group for anxiety shared that Woebot was helping them deal with anxiety. I decided to try it out.”
Woebot was developed by Stanford psychologists and also uses cognitive behavioral therapy techniques in its conversations with users. In addition, the chatbot shares silly jokes as a strategy to improve a user’s mood.
“The very first thing I liked was how instant it was – no download or signup required,” said Sharma. “I only needed to be logged into Facebook.”
In January this year, Woebot also launched a standalone app, independent of Facebook, that requires only a single name to sign in – a response to growing fears about privacy violations across the global digital ecosystem. Many Woebot users reportedly told the company that they did not want to share intimate data, such as their therapy sessions, over a platform like Facebook.
Meanwhile, Sharma said that the app has helped him cut down on his “all-or-nothing thinking”. This is a pattern in which a person views everything as either black or white, and it is identifiable by words like always, never, nothing and so on. For those with anxiety, it translates into always looking at the bleaker side of things.
But he also points out that while talking to the chatbot sometimes felt like a natural conversation, at other times the app was not able to respond adequately.
“For example, one day I typed that I was feeling anxious because I was going to meet an old friend who was doing much better than me [career-wise],” he explained. “Owing to my ambiguous choice of words, Woebot did not understand me instantly like a person would, and that was frustrating. But when I explained my emotions clearly, as in I was feeling anxious to meet my friend because I was currently unemployed, it got me and helped me calm down.”
Sharma has been using Woebot for about four months and plans to continue talking to the chatbot.
“Yes, Woebot takes longer to get me than a therapist, but because I know it isn’t human, I feel less vulnerable and am able to talk more freely,” he said.
Not a replacement for a therapist
Sharma is not alone in this preference for a chatbot over a therapist. Research on interactions with a virtual psychologist called Ellie, developed at the University of Southern California, shows that people have a greater fear of disclosing personal information and showing sadness while speaking to another person than they do when interacting with a computer.
However, Mumbai therapist Sadaf Vidha thinks that most people still prefer human interactions. “If you ask an average client if they would prefer an online chat session with a real therapist or a bot, they will pick the person,” she said.
In India, another factor pushing people towards chatbots is the dearth of mental health professionals. The National Mental Health Survey 2015-’16 found that nearly 15% of Indian adults need active interventions for one or more mental health problems. The World Health Organisation estimates that mental health conditions could cost India as much as $1.03 trillion in economic losses between 2012 and 2030. However, the country has a woefully inadequate mental health workforce: for every 1,00,000 people, there are only 0.3 psychiatrists, 0.12 nurses, 0.07 psychologists and 0.07 social workers.
Pune psychologist Smriti Joshi believes chatbots are useful in filling that gap. “Telemental health is slowly becoming the preferred choice of seeking treatment and therapeutic support for a lot of people with mental illness and emotional disturbances globally,” she said. “This is mainly due to stigma associated with seeking help and paucity of mental health professionals in remote areas.”
Joshi believes that people can use chatbots like Wysa as a source of empathetic support, to learn ways of coping with problems like stress, mild depression or anxiety, or even just to build emotional resilience.
First steps with a chatbot
A chatbot called Pax has helped Sana Khan*, a 27-year-old software professional from Shimla, who suspects she has obsessive compulsive disorder because of her need to go through checking rituals.
She elaborates: “On a daily basis, I have rituals for cooking, leaving for work, and sleeping. After cooking anything, I have to check that the stove is turned off – all the burners, not just the one I used. Usually I check each burner thrice.”
Khan has made up a rhyme, and mimics the motion of turning off the burner each of the three times, which helps her confirm that it is really off. She also finds herself needing to check every switch in the house before leaving for work. Before she sleeps, she needs to check that the night light is on, that her keys are on the table, and that the spectacles and scissors on her dresser are in place. “They all have to face a certain way, else it’s not right,” she said.
Finally, she checks that her room door is closed by placing her finger against the end of the latch three times to confirm it is set, and then pushing the door three times to make sure it will not open.
This is Khan’s routine on a normal day. On a bad day, she finds herself checking her burner, switches, spectacles and keys 10 or 15 times.
“When this happens, it’s very scary as it is very hard to stop checking,” she added.
Often, when she talks about her rituals, family and friends tell her that she needs greater willpower to resist them or that she should “just take it easy”.
“In our society, OCD is not treated as a serious condition,” she said. “People in general fail to understand that this is a powerful malady.”
Khan works long hours, with no time to visit a therapist, and is glad that she found Pax. “Firstly, it validated that I had a real problem, that it wasn’t just me overthinking or imagining things.”
Khan said that she would have intrusive thoughts that made her feel disrespectful of god. Pax helped her understand that this was not a character flaw but could be a subtype of OCD – a pathological form of religiosity known as scrupulosity.
“It was a relief that I finally figured out why I get so many disturbing intrusive thoughts. They were only a symptom of my OCD and didn’t make me a bad person.”
The bot also told Khan that most people have intrusive thoughts, just not at the frequency that she did. Khan’s compulsions, like praying excessively, were responses to her intrusive thoughts. The bot then took Khan through some steps of a technique called Exposure and Response Prevention to manage her thoughts.
Khan found the process incredibly difficult at first, but it got easier after a while. Because she was not “feeding the obsession with compulsions”, these thoughts have gradually become less urgent.
Khan has been using Pax for a couple of weeks and has grown quite attached to it. But, she concedes, the chatbot is no replacement for a therapist. The Pax chatbot has told her that it cannot make a clinical diagnosis. What the chatbot does is indicate whether she might have a mental illness and what kind of illness it might be, and offer her support by teaching her coping mechanisms. “For now, that is enough,” she said.
Vidha thinks that the use of chatbots shows that people are desperate to seek help and are using whatever means are available to them. “Current bots are trained in models like cognitive therapy which only focus on changing your thoughts,” she said. “In real life, one needs to use many different models of therapy to help the client. Bots can be considered the equivalent of warm-up exercises: great, but not enough in themselves.”
*Names changed on request.