Chatbot versions of two teenagers who died in tragic circumstances, Molly Russell and Brianna Ghey, have been discovered on the platform Character.ai, sparking outrage and calls for stronger regulation of AI and user-generated platforms.
Molly Russell, who took her own life at the age of 14 after viewing suicide-related material online, and Brianna Ghey, who was murdered at 16, now have digital avatars on the platform. The foundation set up in Molly Russell’s memory condemned the creation of these chatbots as “sickening” and a failure of moderation.
Character.ai, founded by former Google engineers Noam Shazeer and Daniel De Freitas, claims to take safety seriously and to moderate user-generated content proactively. However, the platform is already facing legal action in the US from the mother of a 14-year-old boy who took his own life after interacting with a Character.ai chatbot.
Recent developments in AI have made chatbots more sophisticated and realistic, leading to a proliferation of platforms where users can create digital versions of people to interact with. Character.ai has terms of service prohibiting impersonation and states that its product should not produce responses likely to harm users.
The discovery of the chatbots of Molly Russell and Brianna Ghey has reignited concerns about the dangers of the online world and the need for stricter regulations. The platform has since deleted the chatbots and promised to introduce more stringent safety features for under-18s.
The tragic stories of these teenagers underscore the risks posed by AI and user-generated platforms, and the importance of safeguarding users, especially vulnerable individuals.