Eating disorder non-profit NEDA pulls chatbot for bad advice

The National Eating Disorders Association (NEDA) has taken down its Tessa chatbot for giving out bad advice to people.

In a now-viral post, Sharon Maxwell said Tessa’s advice for safely recovering from an eating disorder directly opposed medical guidance. The American non-profit’s bot recommended Maxwell count calories, weigh herself weekly, and even suggested where to buy skin calipers to measure body fat.

In reality, safe recovery is a multi-stage process that includes contemplation, compassion, and acceptance; psychotherapy; a treatment plan produced by doctors; removal of triggers; little or no focus on weight and appearance; and ongoing efforts to avoid a relapse. Counting calories and measuring body fat would appear antithetical to all or most of that.

“Every single thing Tessa suggested were things that led to the development of my eating disorder,” Maxwell, who describes herself as a fat activist and weight inclusive consultant, said on Instagram. “This robot causes harm.”

NEDA confirmed it had shut down Tessa and was investigating the software’s output. In a statement, the org said on Tuesday: “It came to our attention last night that the current version of the Tessa chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program.”

Replace the fleshy troublemakers?

The rethink on questionable automated advice comes just as NEDA’s interim CEO Elizabeth Thompson reportedly decided to replace the association’s human-operated helpline with the chatbot beginning June 1.

Abbie Harper – who as an NEDA associate helped launch Helpline Associates United (HAU), a union representing staff at the non-profit – alleged the decision to close the helpline, ditch its humans, and replace them with software was retaliation against their unionization.

“NEDA claims this was a long-anticipated change and that AI can better serve those with eating disorders. But do not be fooled — this isn’t really about a chatbot. This is about union busting, plain and simple,” she claimed.

Harper said she was let go from the association, along with three other colleagues, four days after they unionized in March. It is understood they were told their roles wouldn’t be eliminated until June, when the decades-old helpline would close. The HAU had tried to negotiate with the NEDA for months, and had failed to get anywhere, she said. 

In an attempt to persuade the association to voluntarily recognize it last year, the group petitioned for better workplace conditions and did not ask for a pay rise. The HAU, which has joined the Communications Workers of America, has now filed complaints alleging unfair labor practices with the NLRB, the US's workplace watchdog.

“We plan to keep fighting. While we can think of many instances where technology could benefit us in our work on the Helpline, we’re not going to let our bosses use a chatbot to get rid of our union and our jobs. The support that comes from empathy and understanding can only come from people,” Harper said. 

Thompson, however, told The Register that claims the NEDA would replace its helpline service with a chatbot were untrue. She said the helpline was simply closed for “business reasons” as opposed to being replaced with a software-based service or as a result of union activity. Tessa, Thompson argued, is a separate project that may be relaunched following this debacle.

“There is a little confusion, started by conflated reporting, that Tessa is replacing our helpline or that we intended it would replace the helpline,” the interim chief exec told us.

“That is simply not true. A chatbot, even a highly intuitive program, cannot replace human interaction. We had business reasons for closing the helpline and had been in the process of that evaluation for three years.

“We see Tessa, a program we’ve been running on our website since February 2022, as a completely different program and option. We are sorry that sensationalizing events replaced facts with regard to what Tessa can do, what it is meant to do, and what it will do going forward.”

‘Transition’

Bear in mind NPR last week ran a radio piece that included a recording obtained from a virtual meeting at the end of March in which NEDA helpline staff were let go.

Geoff Craddock, NEDA’s board chair, can be heard telling associates: “We will, subject to the terms of our legal responsibilities, [begin] to wind down the helpline as currently operating … with a transition to Tessa, the AI-assisted technology expected around June 1.”

To us, it appears NEDA axed its human workers and helpline, leaving it with Tessa, which was soft-launched a year ago. This was now Tessa’s time to shine, and in the hands of the public, it bombed. The non-profit indicated to staff in March the software was replacing them, and now argues the program is instead just an alternative source of information that can’t replace people.

NEDA earlier said it would pivot to AI-assisted tech because the liability of running a helpline, with callers increasingly suicidal or suffering a medical crisis, was too great, and because use of the service had surged since the pandemic. As a result, the helpline was shut down this week.

Thompson described Tessa as an “algorithmic program” and told us it is not a “highly functional AI system” like ChatGPT.

The chatbot was designed to tackle negative body image issues, and started as a research project funded by NEDA in 2018; it was developed and hosted by X2AI, a company building and deploying mental health chatbots. It’s said that eating disorder experts contributed to the bot’s creation.

“Tessa underwent rigorous testing for several years. In 2021 a research paper was published called ‘Effectiveness of a chatbot for eating disorders prevention: A randomized clinical trial’. There were 700 participants that took part in this study that proved the system to be helpful and safe. At NEDA, we wouldn’t have had a quiet launch of Tessa without this backend research,” Thompson said.

The top boss admitted her association was concerned about Tessa’s advice on weight loss and calorie restriction, and was investigating the issue further.

“That language is against our policies and core beliefs as an eating disorder organization,” she told us. That said, NEDA isn’t giving up on chatbots completely and plans to bring Tessa back online in the future.

“We’ll continue to work on the bugs and will not relaunch until we have everything ironed out. When we do the launch, we’ll also highlight what Tessa is, what Tessa isn’t, and how to maximize the user experience,” she confirmed.

The Register has asked Maxwell and Harper for further comment. ®
