Stanford University School of Medicine

Stanford scholars discuss pros and pitfalls of using computer programs for mental health care

When times are tough, we're often told to talk through our troubles with a mental health professional. But this isn't always practical advice: some people don't feel comfortable seeing a therapist or counselor, and many more cannot access or afford this kind of care.

Now, several studies suggest that talking to a conversational agent (a software program that converses with users through voice or text) can provide patients with benefits similar to those they'd gain from talking with a human mental health professional.

While a human connection might seem like a necessary part of mental health treatment, this isn't always the case. In a recent Journal of the American Medical Association viewpoint article on conversational agents, Stanford’s Arnold Milstein, MD, Jeff Hancock, PhD, and lead author Adam Miner explain that sometimes a "lack of humanness can be a strength."

As Miner explained in a recent Stanford news story:

Talking to another person about mental health can be scary and often treatment is hard to access. Conversational agents may allow people to share experiences they don’t want to talk about with another person. If successful, this technology could recognize and respond to mental health needs. People may be more honest about their symptoms.

Giving patients the ability to have personal health discussions with smartphones and other digital assistants can also increase access and ease of care. "Delivering health care when it’s most needed can make these conversational agents really effective for people," Hancock said in the Stanford story.

When asked if a conversational agent could replace human mental health professionals, though, the Stanford team was skeptical. Hancock said, "I’m not sure that it could ever be more beneficial than interacting with a human mental health professional, but they could play a role in simply being available."

Previously: Suicide, rape and other crises stump Siri and her conversational agent peers; Smartphone app detects changes in mental health patients’ behavioral patterns in real time; and Dr. Robot? Not anytime soon.
Photo by Sam Carpenter
