
The Future of Artificial Intelligence: Can Computers Learn Empathy?

November 16, 2018 by Tina Arnoldi

The capabilities we have today with artificial intelligence (AI) are beyond what we could have imagined even ten years ago. We know computers can save time on manual tasks and increase efficiency. But can computers learn empathy?

FINE is a prototype of an AI-enabled mental health support tool for use in the home. The intent is to respond empathically to a user’s facial expressions and share those observations with family members. The goal appears to be early response to potential mental health issues, but are there dangers in AI empathy?

Alan Majer, the founder of Good Robot, works hands-on with new technologies, exploring the potential of blockchain, machine intelligence, robotics, and connected sensors. He also has a degree in psychology. His main worry about tools like this “is that they may react in context-inappropriate ways. People are extremely complex, and our minds leap easily from one diverse subject to the next. That makes it incredibly difficult for AI to look at a single snapshot of a conversation or a facial expression and easily assess our state or intent appropriately.” Dr. Adam Hofmann expressed a similar concern in an earlier Theravive post, “Can an Algorithm Identify You as Mentally Ill?”

Majer speculates that “it might manage to [assess our state or intent appropriately] 50-60% of the time, and then be in the right ballpark another 20-30% or so, but occasionally it'll be deeply wrong as well, often making a mistake that a human would not.”

This has implications in both directions: the AI may fail to recognize mental health symptoms, leaving someone without treatment, or, conversely, it may identify someone as mentally ill who is simply reacting to a rough day.

In day-to-day interactions such as customer support, being misunderstood by an AI tool like a chatbot is a minor inconvenience and perhaps a source of frustration. But “when a bot promises to be a mental health tool, these mistakes should not be dismissed so lightly,” says Majer. “Generating a contextually-inappropriate response in these cases can put someone's mental health at risk.” When it comes to the possibilities of chatbots or any AI-enabled tool as a therapist, it seems they may best serve as a supplement rather than a replacement.

Even diagnosing a mental health problem in person is difficult; experienced professionals who conduct psychiatric assessments can struggle to reach a diagnosis. Majer notes, “it's sometimes very difficult for one person to recognize and appropriately respond to a mental health issue experienced by another... so it can be a very difficult task to begin with.” Ginean Crawford, MFT, LPC, NCC, with Standing Ground Counseling, LLC, points out the importance of recognizing facial expressions in the counseling relationship. “A distinct frown or furrowing of the brow can communicate seemingly clear emotion,” says Crawford, but there are also microexpressions that, though only briefly displayed, can reveal another layer. An AI-enabled tool can mimic the words spoken, but not the underlying nuances that are seen face-to-face.

Preeti Adhikary, with Fusemachines Inc., sees some benefit in an empathy engine for initial conversations in situations “where mental health is not openly discussed, especially with children.” Caleb Backe, with Maple Holistics, recognizes the advantage of AI empathy “for someone who is unable to express their emotions.” Backe also views AI as a way to help people “be understood by those around them,” and Adhikary believes “the gamification of the expressions [in a tool such as FINE] makes this intuitive/simple and not a chore.”

Adhikary is still cautious. “Machines can recognize patterns and images, not intent. This would be the con of using machines instead of humans to study behavior and signs. Will the tool make the entire process impersonal and remove the human element?” Backe shares this concern about removing the human element. “There’s something to be said for maintaining the human connection that comes with dealing with mental issues. AI can only get us so far in terms of dealing with mental health problems; human interaction still remains the next step.”

About the Author

Tina Arnoldi

Tina Arnoldi, MA, is a marketing consultant and freelance writer in Charleston, SC. Learn more about her and connect at TinaArnoldi.com.

