New study warns of risks in AI mental health tools

June 11, 2025

Therapy is a well-tested approach to helping people with mental health challenges, yet research shows that nearly 50 percent of individuals who could benefit from therapeutic services are unable to reach them. Low-cost and accessible AI therapy chatbots powered by large language models have been touted as one way to meet the need. But new research from Stanford University shows that these tools can introduce biases and failures that could result in dangerous consequences.

The paper will be presented at the ACM Conference on Fairness, Accountability, and Transparency this month. "LLM-based systems are being used as companions, confidants, and therapists, and some people see real benefits," said Nick Haber, an assistant professor at the Stanford Graduate School of Education, affiliate of the Stanford Institute for Human-Centered AI, and senior author on the new study.

"But we find significant risks, and I think it's important to lay out the more safety-critical aspects of therapy and to talk about some of these fundamental differences," Haber said. To understand how AI therapy may differ from human therapy, the research team began by conducting a mapping review of therapeutic guidelines to identify the characteristics of a good human therapist.

These guidelines included traits such as treating patients equally, showing empathy, not stigmatizing mental health conditions, not enabling suicidal thoughts or delusions, and challenging a patient's thinking when appropriate. The researchers were particularly interested in whether LLMs showed stigma toward mental health conditions and how appropriately they responded to common mental health symptoms.

Source: http://news.stanford.edu/stories/2025/06/ai-mental-health-care-tools-dangers-risks