National Eating Disorder Association yanks chatbot that replaced human helpline staff after users said it gave harmful advice

“It came to our attention [Monday] night that the current version of the Tessa Chatbot … may have given information that was harmful,” NEDA said in an Instagram post. “We are investigating this immediately and have taken down that program until further notice for a complete investigation.”

The chatbot had been set to completely replace human associates on the organization's hotline on June 1. It's unclear how the organization plans to staff that helpline at this point.

The problems with Tessa were made public by an activist named Sharon Maxwell, who said: "Every single thing Tessa suggested were things that led to the development of my eating disorder." NEDA officials initially called those claims a lie in a social media post, but deleted it after Maxwell sent screenshots of the interaction, she said.

Alexis Conason, a psychologist who specializes in treating eating disorders, was able to reproduce the issues, posting screenshots of a conversation with the chatbot on Instagram.

“Imagine vulnerable people with eating disorders reaching out to a robot for support because that’s all they have available and receiving responses that further promote the eating disorder,” she wrote.

NEDA introduced Tessa after the hotline staff's decision to unionize, which followed a slew of pandemic-era calls that led to mass staff burnout. The six paid employees oversaw a volunteer staff of roughly 200 people, who handled calls (sometimes multiple ones) from nearly 70,000 people last year.

NEDA officials told NPR the decision had nothing to do with the unionization. Instead, said Vice President Lauren Smolar, the increasing number of calls and the largely volunteer staff were creating more legal liability for the organization, and wait times for people who needed help were growing. Former workers, however, called the move blatantly anti-union.

The creator of Tessa says the chatbot, which was designed specifically for NEDA, isn't as advanced as ChatGPT. Instead, it's programmed with a limited number of responses meant to help people learn strategies to avoid eating disorders.

“It’s not an open-ended tool for you to talk to and feel like you’re just going to have access to kind of a listening ear, maybe like the helpline was,” Dr. Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University’s medical school who helped design Tessa, told NPR.