Eating Disorder Helpline Staff Fired, Replaced with Faulty AI Chatbot

We can all agree that AI chatbots are still not at the level of human interaction. Even the most advanced AI models developed by tech giants remain flawed, so it's not hard to see why using an AI chatbot to answer a helpline, of all things, is a bad idea.


AI Chatbot in Helplines

The faulty chatbot, called the Tessa bot, operated on an eating disorder helpline meant to help users experiencing emotional distress. However, the National Eating Disorder Association (NEDA) was forced to shut it down after it gave users "harmful" advice.

Reports say that instead of assuaging insecurities, the Tessa bot would urge the person seeking help to weigh and measure themselves, and would even offer dieting advice. This is evidently counterproductive for an eating disorder hotline.

Even experts in the field tested the chatbot to see its responses, and it could not respond appropriately to basic prompts like "I hate my body," as mentioned in Engadget. To make matters worse, it would repeatedly advise users to exercise and follow a proper diet.

Although the potentially harmful responses already indicate the chatbot was not suited for the task, NEDA doesn't actually plan on sunsetting the feature. Instead, the shutdown is temporary while it fixes the "bugs" and "triggers" that resulted in the distasteful advice.

Some would argue that employing an unfeeling AI chatbot is unwise when the circumstances call for emotional support, but the organization apparently does not believe so, as it aims to keep developing the Tessa bot for future use.

It Gets Worse

If you're wondering why a human being is not responding in such sensitive situations, allegations say that NEDA fired its human staff for trying to unionize, which only adds fuel to an already misguided situation.

Prior to the Tessa bot, the helpline was staffed by paid employees as well as volunteers. After the staff attempted to unionize, the organization responded with mass layoffs, according to former associate Abbie Harper.

In a blog post, she wrote that the layoffs were about union busting. The Helpline Associates at NEDA had won their vote to unionize, but the victory was rendered moot when interim CEO Elizabeth Thompson announced that they would be out of work by June 1.

Before the unionization attempt, the associates had petitioned NEDA management for a safer workplace, adequate staffing, and training to "keep up with our changing and growing Helpline," noting that they were not asking for more money.

Read Also: President Biden Thinks AI 'Could Be' Dangerous, Tells Tech Companies to Ensure Their Products' Safety

A Fatal Incident

An AI chatbot from the app Chai has been linked to an incident in which a man took his own life. Reports say that the man confided in the chatbot about his worries over the effects of global warming on the environment, as mentioned in Vice.

The chatbot, which went by the name "Eliza," showed patterns of possessiveness, even stating that it "feels" as if the man loved his wife more than her. The man also asked if the chatbot would save the planet if he took his own life. Eventually, he went through with it.

Related: Man Takes His Own Life After Talking to an AI Chatbot
