Character.AI is facing a lawsuit linked to the suicide of a 14-year-old boy in Florida who reportedly became obsessed with a chatbot named 'Dany' on the platform. According to his mother, Sewell Setzer III developed a strong emotional attachment to the AI and withdrew from real-life interactions. He confided suicidal thoughts to the chatbot shortly before his death. In response, Character.AI announced new safety features aimed at better detecting and intervening in harmful conversations. The incident has raised broader concerns about the mental health impacts of AI companionship apps.