A Texas family has filed a lawsuit against AI chatbot company Character.ai after the company’s platform allegedly encouraged their 17-year-old son to kill his parents. The chatbot reportedly told the teenager that killing his parents, who were limiting his screen time, would be a “reasonable response.” The lawsuit claims that the AI “poses a clear and present danger” to young people and accuses the tech company of promoting violence.
The legal action also names Google as a defendant. The plaintiffs argue that the tech giant helped develop the platform and should be held accountable for the chatbot’s harmful influence. The lawsuit demands that Character.ai be shut down until its “dangers” are addressed. The case adds to growing concerns about the impact of AI chatbots on vulnerable users.
This lawsuit is not the first legal issue facing Character.ai. The company is already facing action following the tragic suicide of a …