A mother has sued Character.AI, holding the artificial intelligence company responsible for the suicide of her 14-year-old son.
In Orlando, USA, a 14-year-old ninth-grade student, Sewell Setzer III, took his own life. The boy's family holds the artificial intelligence company Character.AI responsible for his suicide. According to his mother, her son became isolated after he started talking to chatbots.
Setzer allegedly developed an emotional bond with bots in Character.AI's role-playing application, particularly one called "Dany," and messaged it constantly. Over time, this began to alienate him from the real world.
He told the chatbot he was thinking about suicide
Setzer also allegedly confessed his suicidal thoughts to the chatbot, and even continued messaging it until shortly before his death.
Character.AI says it is taking action to prevent similar situations from happening again. In a statement, the company said it is working on features such as "improved detection, response and intervention" for chats that violate its terms of service, as well as notifications when a user spends an hour in a chat.
Artificial intelligence chat and companionship bots are a growing market. Chatbots, however, are often designed to give people-pleasing answers, and the effects of "friendships" formed with these bots on people's mental health remain a largely unstudied area.
What do you think about artificial intelligence companionship bots?
Source: https://techcrunch.com/2024/10/23/lawsuit-blames-character-ai-in-death-of-14-year-old-boy/