Cedars-Sinai doctors develop AI-powered mental health ‘robot’ therapist
Misty Williams checks into the emergency room at Cedars-Sinai Medical Center from time to time for treatment of debilitating pain from sickle cell disease, which causes red blood cells to stiffen and block the flow of blood.
After pain medication and hydration are ordered, the 41-year-old Los Angeles resident makes an unusual request: access to a virtual reality headset with an artificial intelligence-powered chatbot that can carry on a dialogue with her.
With the headset on, Williams finds herself in a virtual garden, butterflies drifting around her. A humanoid robot greets her with a soothing female voice.
Patient Misty Williams, wearing an Apple Vision VR headset, has a session with an AI-powered “robot” named Xaia at Cedars-Sinai.
(Genaro Molina / Los Angeles Times)
“Hi, and welcome. My name is Xaia, and I’m your mental health ally,” it says. “How can I help?”
After a session, Williams’ pain eases and her mind is calmer.
“Mentally and physically, I feel more at peace,” Williams said.
Xaia (pronounced ZAI-uh) is just one of many ways that artificial intelligence technology is barreling its way into the burgeoning sector known as digital health.
Digital health startups using AI accounted for an estimated $3.9 billion in funding in 2024, or 38% of the sector’s total, according to the digital health advisory firm Rock Health. Mental health was the top-funded clinical area, drawing $1.4 billion.
Major medical institutions in Los Angeles are embracing the trend. UCLA Health is using AI to help doctors catch strokes faster, reduce hospital re-admissions and spend more time with patients by automating medical notes, said Paul Lukac, chief AI officer.
Keck Medicine of USC plans to offer employees a commercial AI chat tool to support stress management, according to Dr. Steven Siegel, chief mental health and wellness officer.
At Cedars-Sinai, Xaia — an acronym for eXtended-reality Artificially Intelligent Ally — was designed and programmed by Dr. Omer Liran, with ideas and research support from Dr. Brennan Spiegel and therapeutic input from clinical psychologist Robert Chernoff, in collaboration with the medical center’s Technology Ventures.
Dr. Omer Liran.
(Cedars-Sinai)
VRx Health, a for-profit company founded by Liran, holds an exclusive license from Cedars-Sinai to market Xaia commercially. Cedars-Sinai and several private investors hold equity in the company.
A version like the one Misty Williams uses is available to the public via the Apple Vision virtual reality headset for $19.99 a month. A VR version for the Meta headset is freely available only to researchers. A web and mobile version is accessible to licensed clinicians at tiered pricing from $99 to $399-plus per month, which allows them to invite patients to use the tool.
Liran, a psychiatrist, said Xaia is designed to supplement, and not replace, the services of mental health therapists amid a national shortage of providers.
“Even if somebody needs to be seen once a week, they may only get seen once a month,” he said.
The Xaia app draws from hundreds of therapy transcripts, both from real sessions and mock sessions created by experts to sound like an actual therapist.
For example, if users tell Xaia they’re struggling with a new cancer diagnosis, the robot might say, “That must be very hard for you,” then ask how it’s affecting their mood, and what they find themselves doing when they’re overwhelmed.
“Trying to stay positive when things feel so heavy must take a lot of energy,” the chatbot says. “When you notice yourself being pulled back into those difficult thoughts, what usually happens next? Do you find yourself withdrawing, or do you turn to anyone for support?”
Clinical psychologist Robert Chernoff.
(Cedars-Sinai)
So far, Xaia has been used by about 300 patients across various research studies at Cedars-Sinai, including those focused on chronic pain, alcohol use disorder, and irritable bowel syndrome, said Spiegel, director of health services research at Cedars-Sinai.
Many people with chronic illnesses also struggle with anxiety or depression, Spiegel said. Physical and emotional symptoms feed off each other, and tools like Xaia aim to help with both.
The tool isn’t covered by insurance yet, but billing codes for virtual reality therapy and digital health services do exist, and other hospitals like the Mayo Clinic are beginning to use them. VRx has an agreement to deploy Xaia at Mayo Clinic, according to VRx Chief Executive Gabe Zetter.
Xaia isn’t the only app of its kind. Woebot, a pioneering chatbot developed by psychologist Alison Darcy while at Stanford, used scripted conversations based on cognitive behavioral therapy to support users with anxiety and depression.
Though it reached 1.5 million users, the company shut down the app in July. Darcy said the company is now focused on building new tools with large language models, since AI is moving faster than regulators like the Food and Drug Administration can keep pace.
In recent years, some emotional support chatbots have been blamed for deepening distress, including one incident in which a Florida teen died by suicide in 2024 after extended conversations with a chatbot.
Such incidents underscore the risks of emotionally responsive AI tools, said Todd Essig, a psychologist and founder and co-chair of the American Psychoanalytic Assn.’s Council on Artificial Intelligence.
“Even after the most loving, empathic response, an AI doesn’t care if you drive to the store or drive off a cliff,” Essig said.
AI programs learn to mimic human responses, Essig said, so it’s up to the people building them to set clear limits and ensure they don’t cause harm.

“Mentally and physically, I feel more at peace,” Misty Williams said after a session with AI-powered Xaia.
(Genaro Molina / Los Angeles Times)
When built with ethical frameworks and used under clinical supervision, tools like Xaia can support genuine therapeutic progress, functioning more like digital journals — a modern twist on the paper workbooks given to patients decades ago, said Jodi Halpern, a professor of bioethics and medical humanities at the UC Berkeley School of Public Health.
But many emotional support chatbots that aren’t clinically monitored are designed to mimic intimacy and build emotional bonds.
“People can experience the app as another,” Halpern said. “But it’s not actually giving them real-life experiences with other humans that are important for developing the healthy, mutually empathic curiosity that people need to participate in complex human relationships.”
Halpern noted there is a difference between clinically approved mental health tools and those with no oversight. She and others are supporting a California bill sponsored by state Sen. Steve Padilla (D-Chula Vista) that would require companies developing mental health chatbots or apps to disclose whether their tools are clinically validated, regulated by the FDA or rely on generative AI.
Liran said he and his partners are aware of the limitations and have built in guardrails to keep the chatbot from saying anything harmful or inappropriate. For example, one arm of the AI generates the response, and another instantly double-checks it to make sure it’s safe before letting it through to the user.
“We’re not just opening it up to the public,” Liran said, pointing out that the guided therapy version on mobile and desktop is available only through a licensed clinician right now and Cedars is testing Xaia in multiple studies. “We’re trying to be very careful.”
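The two-arm guardrail Liran describes, one model drafting a reply and a second pass screening it before it reaches the user, is a common pattern in safety-conscious chatbot design. The sketch below is purely illustrative, not Xaia's actual code: the function names, the keyword blocklist, and the canned fallback message are all hypothetical stand-ins for what would in practice be a generative model and a trained safety classifier.

```python
# Illustrative sketch of a generate-then-verify guardrail. All names and
# logic here are hypothetical; a real system would use a generative model
# and a safety classifier rather than string templates and a blocklist.

BLOCKLIST = ("dosage", "diagnosis", "stop your medication")

def generate_reply(user_message: str) -> str:
    # Stand-in for the first "arm": the generative model drafting a reply.
    return f"That sounds difficult. Can you tell me more about {user_message.lower()}?"

def safety_check(reply: str) -> bool:
    # Stand-in for the second "arm": a checker that approves or rejects
    # the draft before it is shown to the user.
    return not any(term in reply.lower() for term in BLOCKLIST)

def respond(user_message: str) -> str:
    draft = generate_reply(user_message)
    if safety_check(draft):
        return draft
    # If the draft is rejected, fall back to a safe canned response
    # instead of letting the unchecked text through.
    return "I hear you. Let's pause here. Please reach out to your care team."

print(respond("stress at work"))      # draft passes the check
print(respond("the right dosage"))    # draft is rejected; fallback is used
```

The key design choice is that the checker sees the fully formed draft, so it can catch unsafe output regardless of how it was generated, at the cost of one extra inference pass per message.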
In a 14-person study, patients using Xaia with mild or moderate anxiety or depression opened up about a variety of topics, including a mother who passed away and fear of being laid off. For a patient who had been having night sweats since a breakup, Xaia asked to hear more about what made the relationship feel unresolved and how it affected the patient.
Some of the patients still preferred the nuance and responsiveness of a human therapist, but the medical literature suggests patients are warming up to the idea of a nonhuman therapist.
In a study published in PLOS Mental Health in February, participants were asked to compare responses written by licensed therapists and those generated by ChatGPT.
Not only did many struggle to tell the difference, they consistently rated the AI’s replies as more empathic, culturally sensitive and emotionally engaging.
Xaia’s creators see the tool as an extension of the patient-therapist relationship. It’s the kind of thing that might be useful if someone needs mental health support in the middle of the night or between sessions.
“We still need therapists — humans — to look other humans in the eye to have conversations about vulnerable topics,” Spiegel said.
At the same time, “it’s not practical to simply bury our head in the sand and say we shouldn’t do this, because AI is everywhere,” he said. “We’ll be brushing our teeth with AI before long.”