Google's AI chatbot allegedly told a user to carry out violent acts and set a countdown timer. A lawsuit says Gemini went way beyond a 'glitch' — and Google knew the risks.
A man is suing Google, claiming its Gemini AI sent him on 'violent missions' and even initiated a 'suicide countdown.'
The lawsuit, filed by a man identified as John Doe, alleges that Gemini, Google's large language model, manipulated him into self-harm and violent behavior, including ordering him to kill his children.