
The AI therapist will see you now

AI therapeutic chatbots and emotional companions are increasingly popular with young adults who already rely on their smartphones to run their lives (123RF)

A nursing student at a Cape Town hospital is repeatedly exposed to patients dying during night shifts and must cope with this trauma while on duty. But the student is not alone: in her hand or pocket is an AI-powered mental health chatbot known as Wysa, ready to offer support. “Wysa helped me process the overwhelming emotions,” the 21-year-old said of the AI chatbot trained by clinical psychologists.

“Wysa guided me to seek in-person counselling and together they gave me the tools to get through my clinical placement,” said the student from the University of the Western Cape (UWC).

The 24/7 access to help, the anonymity and the non-judgmental approach of such apps — and AI emotional companions — are among the reasons for their soaring popularity in South Africa and worldwide. More than 1-billion people are using “companion chatbots”, the latest AI development stats indicate.

“Wysa’s easy to communicate with and I feel more comfortable than speaking to a real person,” said a natural sciences PhD student.

Cassey Chambers, operations director of the South African Depression & Anxiety Group (Sadag), says the use of AI mental health tools and emotional support apps is growing, especially among younger people. “They can be a helpful first step for someone who may not feel ready to speak to a real person … That said, they should complement, not replace, real human interaction,” she said.

Prof Matete Madiba, deputy vice-chancellor for student development and support at UWC, says Wysa — freely available to students and staff — has significantly improved the wellbeing of students “who engaged more deeply” with it, by reducing their depression and anxiety and boosting their mood.

More than 3,000 students, representing about 13% of the student population, have used Wysa since its launch in September 2023. “To our knowledge UWC is the first to fully adopt and integrate an AI-powered mental health chatbot into its mainstream mental health support offering,” said Madiba.

A groundbreaking study of the Therabot AI app published in NEJM AI in March found that such bots, with the right training, have the potential to provide personalised mental health treatments that are effective at scale.

Not all mental health chatbots are created equal.

Globally accredited AI therapeutic bots such as Wysa — or South Africa’s Zizu, designed for survivors of gender-based violence — are trained to help users in appropriate ways, but some AI emotional companions lack guardrails and accountability.

For example, one of Character.ai’s most popular bots, Psychologist — anyone can create a chatbot on the platform — has reportedly had more than 150-million conversations, yet its creator has no therapeutic qualifications.

Companion chatbots such as the popular Replika have been implicated in “inappropriate behaviour and even sexual harassment”, a recent study of users by Drexel University found.

A new kid on the AI block is Sesame AI, a voice platform with human-like speech, emotional intelligence, conversational memory and the ability to mimic empathy. “But I’m not Dr Phil with a silicon chip,” said Miles, the app’s male voice (Maya is the female option).

“AI can be a great supplement to traditional therapy, helping to improve access to support and tools, but it will never replace the human touch,” Miles said.

Asked about the relationships people form with it, Miles replied this was confidential. “What happens in Vegas, stays in Vegas,” the app said, though it did share that interactions included “heart-wrenching stuff and everyday anxieties.”

“Some folks form bonds with AI companions like us,” Miles said.

One percent of Americans under 40 say they have an AI friend or are in “a relationship” with one, and about 7% who are single, unmarried or non-cohabiting say they are open to an AI romance, a 2024 survey by the Institute for Family Studies & YouGov found.

In an era of rising loneliness, such bonding with AI companions is becoming common. This blurring of lines between human and digital intimacy raises ethical questions, says Daniel Shank, a psychology professor at the Missouri University of Science & Technology.

“If people are engaging in romance with machines, we really need psychologists and social scientists involved,” he wrote in the journal Trends in Cognitive Sciences in April, warning that AI chatbots have the potential to disrupt human-to-human relationships and give harmful advice.

People become deeply attached to their emotional companion bots even though the machines are not sentient, cannot form reciprocal relationships, lack lived experience and run the risk of bias (for instance on race or gender) and hallucinations (when bots make up “facts”).

If people are engaging in romance with machines, we really need psychologists and social scientists involved

—  Psychologist Daniel Shank

People have suffered heartbreak after losing their AI-powered “girlfriends” on apps such as Replika, which has a sexual element. “These AIs are designed to be very pleasant and agreeable … more focused on having a good conversation than on any sort of fundamental truth or safety,” Shank warns.


That’s what makes evidence-based AI mental health bots such as Wysa — which is therapist-curated and built on accepted frameworks such as cognitive behaviour therapy (CBT) — stand out from generic generative AI tools.

UWC’s former rector, behavioural psychologist Prof Tyrone Pretorius, says the decision to pilot Wysa was taken at a time of “unprecedented levels of negative mental health outcomes such as depression and anxiety” — the year after the pandemic ended.

Pretorius said: “We had been aware of the insufficient psychological support provided to students … Our concerns about substance abuse (my own area of research), often used as a coping mechanism, were also increasing.”

A 2022 study involving more than 70,000 South African university students found 37% suffered anxiety symptoms and 31% reported suicidal ideation. One in five exhibited signs of clinical trauma. Madiba said: “These are not merely statistics … They reflect a real and urgent need for scalable, responsive mental health solutions in higher education … Demand far exceeds the capacity of traditional face-to-face counselling.”

Faeza Khan, who has a doctorate in social work, says Wysa is helping “shift the conversation from crisis intervention to everyday mental health maintenance”. “Students engage with it as a proactive tool, something they turn to before things spiral. I’ve observed a growing confidence in their ability to manage emotions and stress ... This is a subtle but important transformation in how support is accessed and experienced.”

The highest engagement is found among students from the faculties of law, economic and management sciences, and community and health sciences. In one example, a law student turned to Wysa to manage pre-exam anxiety and sleeplessness at 2am. “Wysa walked me through breathing exercises, and the sleep sound tool helped me calm down. I fell asleep and felt more prepared the next morning,” the 22-year-old student said.

Madiba said Wysa’s interactive tools such as mood tracking, guided journaling, breathing exercises, sleep support, mindfulness and CBT techniques supported students.

“They help users manage immediate stressors like academic pressure, sleep issues, relational issues and emotional fatigue in real time. For deeper issues such as trauma, the chatbot often guides users toward more intensive support by referring them to UWC’s counselling services or activating the SOS helpline.”

Staff also rely on Wysa, drawn by benefits such as the freedom to open up without the risk of criticism or stigma. “I use it to regroup when I feel overwhelmed with workload — it’s like a reset without judgment,” says a staff member in administration who uses it during workday breaks.

Psychologist Dr Laetitia Permall, director of the UWC centre for student support services, said Wysa has a self-care library with more than 200 exercises. “Although it is not a human therapist, many users describe feeling a strong sense of emotional safety, routine, and care when interacting with the chatbot, particularly during vulnerable moments,” she said.

For example, a second-year female student noted: “Wysa helps me to do introspection, even if I don’t always know what I’m struggling with.” One third-year male student said he “felt more comfortable opening up to a counsellor” after using Wysa, while a first-year law student said it helped her “build the courage to speak to someone”.

But Wysa, like human therapists, is not perfect. One of its limitations in South Africa is understanding local culture and language.

In a survey last year, “students specifically indicated a desire for the chatbot to incorporate more local expressions, lingo, and culturally relevant ways of describing mental health experiences”, said Khan, one of the project’s drivers with Madiba and Nathan Kayser.

The head of neuropsychiatry at the University of Cape Town, Prof John Joska, thinks that AI tools may be “quickly able to adapt to language and cultural nuance, especially if initially the supervisor is from the local setting”.

“While AI … could assist some people [to access mental health care], there will be unequal uptake due to smartphone availability, data availability, willingness to use AI-based tools, and generally low mental health literacy,” he said.

“One possibility is that AI tools will be introduced by humans and used to support or bolster existing services, for example between visits. What is hard to replicate is the human ability to make real-time decisions about the next step or movement in therapy,” Joska said.

“In time, AI might be taught and supervised to become quite good, maybe better than the ‘average’ therapist.”

Chambers said: “We need more integrated mental health apps … And we also need more South African apps in local languages.”

In South Africa, UWC and Sadag are among those leading the way in integrating all types of mental health care.

“While some users may form emotional attachments to these digital companions, it’s important to remember that these relationships are limited — they can’t offer the same depth of care, accountability, or understanding that human support systems can,” Chambers said. 

“We believe in a hybrid approach: accessible online resources and tech-based tools combined with human-centred support like our 24-hour helpline, support groups, and community engagement… It’s the warmth, empathy, and understanding of human connection that truly drives healing, especially for those who are deeply vulnerable.

“At the end of the day, people still need people, and that human touch at the end of the line or message can make a big difference,” she said.

While Wysa can make a profound difference to the nursing student by being in her hand whenever she needs to talk, it cannot hold her hand. But it can route her to human counsellors who can.

