Chatbots & Mental Health Care: What Is AI Therapy & Does It Work?


Everyone’s heard about how artificial intelligence is changing the technology and content creation industries, but what about science? Artificial intelligence is already seeing use in fields such as physics for data analysis, modelling, and model evaluation. It’s even being used in psychology and therapy as a mental health care tool through chatbots.

 

If you’re working as a psychologist or enrolled in an online Mental Health Counseling Masters program, it’s important to stay current on advancements in the field. Artificial intelligence is game-changing for therapy and counseling and could make both more accessible to the public. AI tools could also help with administrative tasks, assist with assessments, and aid in training new clinicians.

 

We’ve already seen educational institutions explore the use of artificial intelligence tools such as ChatGPT in classroom settings. The possibilities for AI software are near-endless, from research and teaching to practical, real-world applications.

 

Understanding artificial intelligence

If you’ve been on social media over the past few years, you can’t have avoided seeing claims that artificial intelligence will change everyone’s lives. Before we talk about its potential, especially in mental health care, it’s important to understand what it is.

 

Artificial intelligence describes technology that allows computers, digital devices, and software to learn, read, write, create, and analyze, much as a human would. Its main goal is to simulate human intelligence. The technology has advanced rapidly in recent years, with major tech companies such as Google, Apple, and Microsoft investing heavily in machine learning and AI.

What is AI Therapy?

AI therapy refers to providing therapy or counseling sessions through technology that uses artificial intelligence, such as a chatbot. AI-based tools are making headway in every industry, so it’s no surprise that psychologists and therapists are looking for ways to apply them to their work.

 

While it isn’t a replacement for seeing an actual psychologist or therapist, ChatGPT can already offer tips and advice. For example, if you tell ChatGPT you’re feeling overwhelmed or anxious, it will provide recommendations, such as relaxation, meditation, and breathing exercises.

 

The difference between ChatGPT and an AI therapy tool is that the latter is designed specifically to provide mental health care support to the user. Currently, few AI therapy tools exist; good examples include Mindspa, Earkick, Wysa, and Youper.

 

Features vary across the different AI therapy tools. For example, Mindspa features a simple AI chatbot that asks you questions throughout your day for journaling and reflection. In comparison, Wysa’s chatbot is more in-depth, drawing its responses from cognitive behavioral therapy, dialectical behavior therapy, and motivational interviewing techniques.

Does AI Therapy work?

As the technology is new, there is still little information on the efficacy of AI therapy. Few studies have been done; however, a Forbes article has reviewed different AI therapy options, breaking down their cost, availability, features, and user experience.

 

The article also references an academic study, available on the National Library of Medicine website, that examines the outcomes of AI therapy tools. The study found that AI therapy tools had high user satisfaction and engagement, as well as positive retention rates.

 

An academic article by Francesca Minerva and Alberto Giubilini also covers the possibility of using AI as a mental health care tool, stating that while the technology is still in its early days, it could have a positive effect on accessibility. The article also suggests that artificial intelligence could be used in other areas of therapy and psychology, such as diagnosing clients and analyzing behavioral patterns.

Accessible and affordable mental health

The United States is currently facing a mental health crisis, with suicide rates rising over the past decade, especially among youth, alongside increases in substance abuse and mental illness. From 2019 to 2020 alone, 20.78% of adults experienced a mental illness; that’s over 50 million Americans.

 

In 2020, fewer than half of the adults and youth experiencing mental health conditions received treatment that year. Among people with substance abuse disorders, the figure was less than 10%. There is a clear need, in the United States and worldwide, for accessible and affordable mental health care.

 

The barriers to accessible and affordable mental health care come down to a few factors, such as the rising cost of living, long wait times, and local provider shortages. While it isn’t a permanent fix, AI therapy could provide a good starting point for people looking for resources and could assist with basic mental health care.

Is AI Therapy a good alternative?

Artificial intelligence chatbot tools geared toward mental health care and counseling are still in their early stages and aren’t without faults. Quite a few doctors and psychologists have already voiced their concerns about AI therapy and the areas that need improvement.

 

For example, impersonal treatment is a commonly cited fault. Artificial intelligence often lacks the human connection and empathy that a professional therapist or psychologist provides. It also lacks cultural and racial understanding, which limits the insight it can offer.

Another issue is that “AI therapy is only as good as what it is exposed to.” An artificial intelligence chatbot replies with information based on the user’s input, which leaves room for error. The chatbot cannot pick up on nonverbal social cues, for instance, and can misinterpret user input, which could result in inappropriate advice.



