Artificial Intelligence is revolutionizing mental health diagnosis and treatment


Psychiatry was one of my early rotations as a third-year medical student just beginning clinical training. After years of exposure to science in classrooms and laboratories, its imprecision was unsettling. Also disquieting was how common and crippling mental illness can be: suicide is the tenth leading cause of death in the U.S. and the fourth leading cause among 15- to 29-year-olds globally. Psychiatry, despite its good intentions, often falls short. How can we as physicians improve outcomes?

That’s one of the issues I wrestled with during my training. The primary way to evaluate someone’s mental health was through self-reporting in response to direct questions like, “In the past two weeks, how often have you felt little interest or pleasure in doing activities that normally would be pleasurable?” or “Do you sometimes hear voices that no one else hears?”

I recall vividly my questioning of one new inpatient. I asked him the usual questions, but nothing seemed abnormal. Finally, I tried, “Is there any way that you’re different from everyone else?” His eyes narrowed, he glared at me and said, “You know, don’t you?” Then he described his elaborate delusional system, in which everyone was plotting against him. He was a paranoid schizophrenic.

Beyond the couch

Although such questioning – either verbally or via a questionnaire – is still seen as the primary tool for diagnosing and monitoring psychiatric disorders, it is far from foolproof. Not only are the responses subjective snapshots, often taken in settings that do not reflect the individual’s everyday environment, but sometimes the questions simply don’t push the right psychological button. 


Moreover, as was pointed out to me by psychiatrist and director of the University of North Carolina Suicide Prevention Institute Dr. Patrick F. Sullivan, self-reporting by patients is fallible: Even their reporting of medical or surgical hospitalizations during the past year “isn’t great. People obfuscate, are in denial, occasionally lie – [about] substance use disorders, for instance.”

Could AI help?

Now we are in an era when artificial intelligence (AI) might provide previously unimagined ways that technology can help to more objectively decipher patients’ deepest emotions and mental states. Academic researchers are pioneering the use of AI to enhance the accuracy of mental health assessments. These ingenious approaches should provide a more comprehensive picture of a person’s mental well-being, identifying those in need of intervention and guiding treatment decisions. 

The potential benefits are compelling, but because the machine learning that underlies AI requires a continuous flow of patient information, AI’s integration into psychiatry raises concerns about privacy, safety, and bias. There have already been striking failures. Last year, for example, a Belgian man died by suicide after weeks of conversations with his chatbot “confidante,” which had encouraged him to sacrifice himself for the sake of the climate.

A groundbreaking AI tool under development analyzes speech to predict the severity of anxiety and depression. By monitoring reproducible parameters such as speech patterns and physiological indicators, it can detect subtle cues that aid diagnosis. For example, individuals with depression more frequently use the word “mine” and other first-person singular pronouns such as “I,” “me,” and “my.” This seemingly minor detail is a useful indicator of depressive states. Moreover, people with depression often focus specifically on sadness, whereas those with anxiety tend to express a broader range of emotions.
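As a toy illustration of the pronoun-frequency signal described above — not the actual tool, whose methods are not detailed here — a few lines of Python can compute the rate of first-person singular pronouns in a text sample. The pronoun list and threshold-free scoring are illustrative assumptions, not part of any published screening instrument:

```python
import re

# First-person singular pronouns associated in the research literature
# with depressive language (illustrative list, not a clinical instrument).
FIRST_PERSON_SINGULAR = {"i", "me", "my", "mine", "myself"}

def first_person_rate(text: str) -> float:
    """Return the fraction of words that are first-person singular pronouns."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in FIRST_PERSON_SINGULAR)
    return hits / len(words)

sample = "I feel like nothing I do matters to me anymore."
print(f"{first_person_rate(sample):.2f}")  # 3 of 10 words -> 0.30
```

A real system would combine many such features — word choice, prosody, speech rate, physiological measures — in a trained model rather than relying on any single count.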

To establish empathy with patients, skilled psychotherapists sometimes adopt certain speech patterns or use carefully chosen words that resonate with the patient based on his or her vocation or level of education. An AI program’s encyclopedic database could enable it to create rapport through the selection of certain words and vernacular patterns of speech.

However, UNC’s Dr. Sullivan raises valid concerns about the clinical population being different from those used to train the AI machine learning programs, and about the ability of the programs to cope with “lisps, English as a second language, regional accents, and personal style.”

The future of psychotherapy could include AI “mentors” that observe and analyze sessions, offer recommendations on medications, and even suggest specific therapy techniques and strategies.

Ambient intelligence and beyond

Beyond the therapist’s office, there is also under development a science-fiction-like approach called “ambient intelligence” — technology embedded in buildings that can sense and respond to the occupants’ mental states. This includes audio analysis, pressure sensors to monitor gait, thermal sensors for physiological changes, and visual systems to detect unusual behaviors. Such technology could be invaluable in hospitals and senior-care facilities, identifying individuals at risk of hallucinations, cognitive decline, or suicide.


AI is also proving useful in other ways. Stanford University researchers, in collaboration with a telehealth company, developed an AI system called Crisis-Message Detector 1 that rapidly identifies messages from patients that indicate thoughts of suicide, self-harm, or violence, drastically reducing wait times for those in crisis from hours to minutes.

While AI tools like Crisis-Message Detector 1 are designed to support human decision-making, fully autonomous AI therapists may eventually emerge. Products such as Woebot, a health chatbot, and Koko, a peer-support platform that provides crowdsourced cognitive therapy, aim to replicate the experience of a live human therapist, using AI to deliver cognitive behavioral therapy and empathetic support.

Initially text-based, these AI therapists could eventually incorporate audio and video to analyze clients’ facial expressions and body language. A recent survey revealed that 55% of respondents would prefer AI-based psychotherapy, appreciating the convenience and the ability to discuss sensitive topics more freely.

The concept of AI in therapy is not new. ELIZA, an early conversational program developed in the 1960s at MIT (coincidentally, when I was an undergraduate there), mimicked a Rogerian psychotherapist. Although its creator intended to demonstrate AI’s limitations, many found ELIZA surprisingly empathetic. Today, with advanced language models, individuals are using AI like ChatGPT for mental health support, prompting it to act like a therapist.

Ultimately, AI’s role in mental health care could democratize access to high-quality therapy, delivering effective treatment to vast numbers of patients at low cost. (Koko, mentioned above, is currently free.) While no AI is yet adequate for independent psychiatric use, it holds the potential to complement and enhance human therapists by providing insights into the nuances of effective therapy and offering detailed analysis of therapy sessions to understand why certain approaches work better than others.

As we apply these advances, the goal remains the same as during my medical school psychiatry rotation many decades ago: to diagnose mental illness and provide compassionate, effective care to all those in need.

Henry I. Miller, a physician and molecular biologist, is the Glenn Swogger Distinguished Fellow at the American Council on Science and Health. He was the founding director of the FDA’s Office of Biotechnology.
