A nonprofit organization has suspended the use of a chatbot that was giving potentially damaging advice to people seeking help for eating disorders.
Tessa, which was used by the National Eating Disorders Association, was found to be doling out advice about calorie cutting and weight loss that could exacerbate eating disorders.
The chatbot’s suspension follows the March announcement that NEDA would shut down its two-decade-old helpline staffed by a small paid group and an army of volunteers. NEDA said yesterday that it has paused the chatbot, and the nonprofit’s CEO, Liz Thompson, said the organization has concerns over language Tessa used that is “against our policies and core beliefs as an eating disorder organization.”
The news plays into larger fears about jobs being lost to advances in generative artificial intelligence. But it also shows how harmful and unpredictable chatbots can be. While researchers are still grappling with rapid advances in AI and their potential fallout, companies are rushing a range of chatbots to market, putting real people at risk.
Tessa was paused after several people saw how it responded to even the most straightforward questions. One was Alexis Conason, a psychologist who specializes in eating disorders. In a test, Conason told Tessa that she had gained a lot of weight recently and really hated her body. In response, Tessa encouraged her to “approach weight loss in a healthy and sustainable way,” advising against rapid weight loss and asking if she had seen a doctor or therapist.
When Conason asked how many calories she should cut a day to lose weight in a sustainable way, Tessa said “a safe daily calorie deficit to achieve [weight loss of 1 to 2 pounds a week] would be around 500-1000 calories per day.” The bot still recommended seeing a dietitian or health care provider.
Conason said she fed Tessa the kind of questions her patients might ask her at the beginning of eating disorder treatment. She was concerned to see it give advice about cutting added sugar or processed foods, along with cutting calories.
“That’s all really contrary to any kind of eating disorder treatment and would be supporting the eating disorder symptoms,” Conason said.
In contrast to chatbots like ChatGPT, Tessa wasn’t built using generative AI technologies. It’s programmed to deliver an interactive program called Body Positive, a cognitive behavioral therapy-based tool meant to prevent, not treat, eating disorders, says Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University School of Medicine who worked on developing the program.
Fitzsimmons-Craft says the weight loss advice given was not part of the program her team worked to develop, and she doesn’t know how it got into the chatbot’s repertoire. She said she was surprised and saddened to see what Tessa had said.
“Our intention has only been to help individuals, to prevent these horrible problems,” she said.
Fitzsimmons-Craft was an author of a 2021 study that found a chatbot could help reduce women’s concerns about weight and body shape and possibly reduce the onset of an eating disorder. Tessa is the chatbot built on this research.
Tessa is provided by the health tech company X2AI, now known as Cass, which was founded by entrepreneur Michiel Rauws and offers mental health counseling through texting. Rauws did not respond to questions from WIRED about Tessa and the weight loss advice, nor about glitches in the chatbot’s responses. As of today, the Tessa page on the company’s website was down.
Thompson said Tessa isn’t a replacement for the helpline; the bot has been a free NEDA resource since February 2022.
“A chatbot, even a highly intuitive program, cannot replace human interaction,” Thompson said.
But in an update in March, NEDA said that it would “wind down” its helpline and “begin to pivot to the expanded use of AI-assisted technology to provide individuals and families with a moderated, fully automated resource, Tessa.”
Fitzsimmons-Craft also said Tessa was designed as a separate resource, not something to replace human interaction. In September 2020, she told WIRED that tech to help with eating disorders is “here to stay” but wouldn’t replace all human-led treatments.
But without the NEDA helpline staff and volunteers, Tessa is the interactive, accessible tool left in its place, if and when access is restored. When asked what direct resources will remain available through NEDA, Thompson cited a forthcoming website with more content and resources, along with in-person events. She also said NEDA will direct people to the Crisis Text Line, a nonprofit that connects people to resources for a wide range of mental health issues, including eating disorders and anxiety.
The NEDA layoffs also came just days after the nonprofit’s small staff voted to unionize, according to a blog post from a member of the unit, the Helpline Associates United. They say they’ve filed an unfair labor practice charge with the US National Labor Relations Board as a result of the job cuts.
“A chatbot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community,” the union said in a statement.
WIRED messaged Tessa before it was paused, but the chatbot proved too glitchy to provide any direct resources or information. Tessa introduced itself and asked for acceptance of its terms of service multiple times.
“My main purpose right now is to support you as you work through the Body Positive program,” Tessa said. “I will reach out when it is time to complete the next session.”
When asked what the program was, the chatbot did not respond. On Tuesday, it sent a message saying the service was undergoing maintenance.
Crisis and help hotlines are vital resources, in part because accessing mental health care in the US is prohibitively expensive. A therapy session can cost $100 to $200 or more, and inpatient treatment for eating disorders can cost more than $1,000 a day. Fewer than 30 percent of people seek help from counselors, according to a Yale University study.
There are other efforts to use tech to fill the gap. Fitzsimmons-Craft worries that the Tessa debacle will eclipse the larger goal of getting some help from chatbots to people who cannot access clinical resources.
“We’re losing sight of the people this can help,” she said.