Artificial intelligence: A digital companion for mental health care?
Vijay Garg *
The last decade has seen Artificial Intelligence take centre stage in the technological world, and the journeys of humanity and technology are now deeply intertwined. AI is integrated into our lives in numerous ways, ranging from nudges to complete that email, to having to prove that you are human by picking out objects in a blurry image while logging in to your email inbox from an unfamiliar location.
The ubiquitousness of AI makes it pertinent for us to understand its potential and limitations in meeting human psychological needs. Can AI play a role in mental health care? "In psychology, interventions need to be personalised. AI can analyse big data sets and analyse the needs to personalise certain interventions," says Bhambal, who holds two Master's degrees in Psychology and works in London for an AI healthcare startup.
Mustafa Suleyman, Microsoft's AI CEO and one of the primary architects of many of the AI models we use today, describes AI in his TED Talk as a new "digital species" that we will come to see as a "digital companion" on our life journey.
He believes this is the most accurate and fundamentally honest way of describing what is about to emerge with AI. According to Suleyman, "Referring to AI as just tools does not capture its scope. AIs are more dynamic, more ambiguous, more integrated, and more emergent than mere tools, which are entirely subject to human control".
THE EMOTIONAL EXPERIENCE
A study published in the Proceedings of the National Academy of Sciences (PNAS) journal has found that AI-generated messages made recipients feel more "heard" than messages generated by untrained humans, and that AI was better at detecting emotions.
The research, conducted by Yidan Yin, Nan Jia, and Cheryl J Wakslak of the USC Marshall School of Business, addressed a pivotal question: "Can AI, which lacks human consciousness and emotional experience, succeed in making people feel heard and understood?" The responses show that leveraging AI's capabilities holds the potential for designing inexpensive, scalable solutions for social support.
"Healthcare is best when it is personalised," says Abhishek Mohanty, founder and CEO of CareFlick, an AI-powered care and team management platform for senior care. "AI can give data points for elderly care, but each case is different.
For example, we have developed a tool which can be used by the caregiver to document the mood and vitals of the elderly. The tool helps the caregiver anticipate the elderly person's behaviour on a given day and prepare accordingly for his or her role as a caregiver."

Post-pandemic, the demand for remote mental health care has surged globally.
"One in seven Indians struggles with mental health issues. We've got a shortage of therapists, with just one for every 400,000 people. Reports show a staggering 35% increase in depression and anxiety disorders in India over the last few years," says Anika Beri, founder of ClearMind, a personalised AI therapy platform.
"In just six months, we crossed 60,000 users from across 126 countries. That speaks volumes about the need for such a service," adds Beri. Wysa, founded in 2016 in India, is among a wave of start-ups developed to meet this need.
Wysa supports individuals with the help of an "emotionally intelligent" conversational agent that guides users through evidence-based cognitive-behavioural techniques (CBT), meditation, mindfulness exercises and micro-actions to help build mental resilience skills. The governments of Singapore and the UK have started using Wysa to implement AI-based mental health care.
One of the hurdles in mental health care is the stigma associated with reaching out to a counsellor or coach for emotional support. AI seems to have a definite advantage in lowering this entry barrier. "People seem to be more comfortable accessing mental health care from their own homes. They prefer to engage with a chatbot over speaking to a person.
AI helps make mental health care more accessible in this respect," says Bhambal. Beri's experience with ClearMind users also indicates a similar phenomenon. "We found that people could be themselves without the fear of being judged when speaking with an AI bot," she says.
MISSING HUMAN CONNECTION
Does AI-based intervention come at the cost of widening the gap in real human connection in an already loneliness-ridden world? Yin and her colleagues found that while people felt more heard when they received an AI message compared with a message from a human, they felt less heard once they learnt that the message came from an AI. This underscores the importance of the human element when it comes to a field as sensitive as healthcare.
A year ago, I had my first coaching client who sought professional support owing to anxiety over AI-powered technologies making his role in the organisation redundant over the next few years. As I worked with him, I realised the importance of the human touch when it comes to holding space in moments of pathos.
"Unsupervised therapy is a risk and a grey area," says Aanandita Vaghani, a mental health counsellor based in Mumbai. "However, I feel AI could support clients with developing habits such as journalling and executing routine tasks such as setting reminders," she adds.
Vaghani, however, feels that when it comes to understanding the nuances of complex ideas such as empathy, AI can be lacking. Unlike a therapist who can empathise with you based on their own personal experience, AI can only create an "artificial empathy that's purely cognitive," Vaghani observes.
TECHNOLOGY WITH VALUES
In a way, AI is the whole of everything we humans have created, distilled into something that we can all interact with and benefit from. It is a reflection of humanity across time, capturing both our biases and our gifts.
In that respect, if we are to consciously give birth to this new "digital species", it is critical we pay attention to the kind of minds that we are nurturing within us. The more we invest in healing from our own transgenerational traumas and deep-seated biases that lead to violence through "othering", the more we can hope to create a technology that will emulate these values in its application.
* Vijay Garg wrote this article for The Sangai Express
This article was webcasted on September 18 2024.