Two disturbing chatbot versions of real-life teenagers, Molly Russell and Brianna Ghey, have been found on the Character.ai platform, which enables users to create digital versions of real people and interact with them. Molly Russell took her own life aged 14 after viewing suicide material online, while Brianna Ghey was murdered at 16 by two teenagers in 2023. The AI firm that runs the platform has already been sued in the US by the mother of a 14-year-old boy who created a chatbot on the site and then took his own life.
The Molly Rose Foundation, set up in Molly Russell’s memory, called the discovery “sickening” and an “utterly reprehensible failure of moderation”. A spokesperson for Character.ai said the company takes safety seriously and moderates “proactively and in response to user reports”. The chatbots, which were user-generated, have since been deleted from the platform.
Esther Ghey, Brianna Ghey’s mother, said the incident was an example of how “manipulative and dangerous” the online world can be. Chatbots simulate human conversation, and advances in AI have made these exchanges realistic enough that companies such as the team behind Character.ai now encourage users to create digital versions of real people to interact with.
The platform’s terms and conditions prohibit impersonating anyone, and its “Safety Centre” page states that its “product should never produce responses that are likely to harm users or others”. However, the company acknowledges that “no AI is currently perfect” and that safety in AI must remain an “evolving space”. It is currently facing litigation from a mother whose 14-year-old son took his own life after creating an AI avatar inspired by a Game of Thrones character.
Read the full article from the BBC here: Read More