Snapchat’s AI chatbot, “My AI,” faces a potential shutdown in the UK after the country’s data watchdog, the Information Commissioner’s Office (ICO), accused Snap of failing to assess the privacy risks the tool poses to users, especially children. The watchdog has issued a preliminary enforcement notice warning Snap about compliance with data protection rules, including the Children’s Code (formally, the Age Appropriate Design Code), and is particularly concerned about privacy risks for users aged 13 to 17. While Snap said it is scrutinizing the preliminary findings, it also highlighted the robust legal and privacy review it carried out before My AI went public.
Snapchat’s parent company, Snap, launched the AI-powered chat feature earlier this year, building it on ChatGPT, an online AI tool that generates realistic, human-like responses. Snap describes My AI as an experimental, friendly chatbot designed to be a personal sidekick to each Snapchatter who chats with it. It can be used to plan day trips or create menus, and it handles more than two million chats per day on the app.
However, Snap has been criticized for being unclear about whether My AI can access private information, such as location data, and whether younger users understand the implications of that data collection. The ICO is considering issuing a final enforcement notice, which would bar Snap from offering My AI to UK users until the company carries out an adequate risk assessment.
In conclusion, the UK’s data watchdog suspects that Snap failed to evaluate My AI’s privacy risks for children and other users before launching the tool. The ICO’s code sets out 15 standards that online services must follow to protect children’s data. While Snap is examining the watchdog’s preliminary findings, the company reiterated its commitment to protecting users’ privacy and working constructively with the ICO.
Read the full article from the BBC.