If you or someone you know may be experiencing a mental health crisis, contact the 988 Suicide & Crisis Lifeline by dialing or texting “988.”
Vince Lahey of Carefree, Arizona, embraces chatbots. From Big Tech products to “shady” ones, they offer “someone that I could share more secrets with than my therapist.”
He especially likes the apps for feedback and support, even though sometimes they berate him or lead him to fight with his ex-wife. “I feel more inclined to share more,” Lahey said. “I don’t care about their perception of me.”
There are a lot of people like Lahey.
Demand for mental health care has grown. One study analyzing survey data found that self-reported poor mental health days have risen 25% since the 1990s. According to the Centers for Disease Control and Prevention, suicide rates in 2022 matched a 2018 peak, the highest level recorded in nearly 80 years.
Many patients find a nonhuman therapist, powered by artificial intelligence, more appealing than a human with a reclining couch and a stern manner. Social media is replete with videos begging for a therapist who's "not on the clock," who's less judgmental, or who's just less expensive.
Most people who need care don't get it, said Tom Insel, former head of the National Institute of Mental Health, citing his former agency's research. Of those who do, only 40% receive even "minimally acceptable care."
“There’s a massive need for high-quality therapy,” he said. “We’re in a world in which the status quo is really crappy, to use a scientific term.”
Insel said engineers from OpenAI told him last fall that about 5% to 10% of ChatGPT's then roughly 800 million users rely on it for mental health support. …