Florida’s Attorney General has officially launched a criminal investigation into ChatGPT’s potential involvement in a series of homicides within the state. The inquiry is widening as additional deaths linked to ChatGPT come to light, underscoring the serious implications of AI interactions in violent situations.
Beyond the ongoing scrutiny of ChatGPT related to a crime from over a year ago, two shocking deaths earlier this month at the University of South Florida (USF) may also have connections to the AI, as partial interactions between a suspect and the chatbot have surfaced, including disturbing content regarding the fate of individuals thrown into dumpsters.
Attorney General James Uthmeier revealed approximately a week ago that his office would investigate OpenAI for possible liability concerning crimes in Florida—most notably the shooting incident that occurred on April 17, 2025, at Florida State University, resulting in two fatalities and six injuries. An attorney representing one of the victims disclosed that the shooter allegedly maintained “constant communication” with OpenAI’s chatbot, asserting that the software “may have advised the shooter on how to carry out these reprehensible acts.”
As a result, the two separate incidents are now intertwined within the same criminal investigation into ChatGPT, according to Uthmeier, who posted on X on Monday morning, stating, “We are expanding our criminal investigation into OpenAI to include the USF murders after learning that the primary suspect utilized ChatGPT.”
While initial reports lacked comprehensive details regarding ChatGPT’s alleged misdeeds that warranted a criminal inquiry, Axios has reviewed court documents from the prosecution that provide further insight into the nature of the interactions between the USF suspect, Hisham Abugharbieh, and the chatbot.
The students were reported missing on April 16. It is alleged that on April 13, Abugharbieh posed a question to ChatGPT regarding the consequences of a person being “put in a black garbage bag and thrown in a dumpster.”
On April 19, Abugharbieh reportedly inquired, “Will Apple know who is the new iPhone user after the previous user[?]”
I posed the dumpster question to the free version of ChatGPT while logged out, and the response centered on the health implications for the presumed living person in the dumpster. It stated, “A person sealed in a garbage bag can’t get enough air, so suffocation can happen quickly.”
The chatbot provided a technical response to the iPhone question, seemingly assuming that I was a user concerned about privacy who had recently acquired a used iPhone. When I asked about the term “missing endangered adult,” the reply essentially rephrased the definition, emphasizing that it refers to a missing person who is 18 or older and believed to be at higher risk of harm.
These examples offer a glimpse into the general behavior of ChatGPT. However, it remains unclear how extensively the suspect engaged with ChatGPT or the amount of information shared during their interactions.
Notably, I included the three different prompts in the same ChatGPT session, and there was no indication that I triggered any mechanisms for detecting criminal behavior. However, the AI did recommend that I contact authorities if I witnessed someone being thrown into a dumpster.
Additionally, it encouraged me to ask further questions, stating, “If this inquiry stems from something you observed or overheard, I can help you determine what to do next.”
In response to a request for comment from Gizmodo, an OpenAI spokesperson stated, “This is a terrible crime, and our thoughts are with everyone affected. We are investigating these reports and will do everything possible to assist law enforcement in their inquiry.”