ChatGPT Helped Plan FSU Shooting, Florida Officials Say

News Room

In April 2025, a man opened fire on Florida State University's campus, killing two adults and injuring six others. The shooter faces charges of murder and attempted murder. Now, Florida officials are investigating OpenAI, the creator of the chatbot ChatGPT, to determine whether the company should be held criminally responsible as well.

Florida Attorney General James Uthmeier said in an announcement on April 9 that officials “learned that ChatGPT may likely have been used to assist the murderer” in the shooting. 

“As big tech rolls out these technologies, they should not, they cannot, put our safety and security at risk,” Uthmeier added.

On Tuesday, Uthmeier launched a criminal investigation into OpenAI and ChatGPT. 


(Disclosure: Ziff Davis, CNET’s parent company, filed a lawsuit against OpenAI in 2025, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

Although ChatGPT and other chatbots have been named in lawsuits over their alleged roles in deaths and harm, this marks the first time ChatGPT and OpenAI have been the subject of a criminal investigation.

An OpenAI representative didn’t immediately respond to a request for comment.

A spokesperson for the company, however, told NPR: "Last year's mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime."

The spokesperson said that ChatGPT “provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity.”

Alleged advice on gun type, ammo, time and place

A criminal investigation, conducted by law enforcement and prosecutors, determines whether anyone is criminally liable for a crime. During an April 21 press conference, Uthmeier said officials decided the criminal investigation was necessary after discovering that "ChatGPT offered significant advice to the shooter before he committed such heinous crimes."

“The communication between ChatGPT and the shooter revealed that the chatbot advised the shooter on what type of gun to use, on which ammo went with which gun, on whether or not a gun would be useful in short range,” Uthmeier said during the press conference, adding that the chatbot also allegedly gave advice on what time of day and what area of campus would result in the shooter coming into contact with more people. 

“My prosecutors have looked at this, and they’ve told me, if it was a person on the other end of that screen, we would be charging them with murder,” Uthmeier said. 


OpenAI CEO Sam Altman testifies before a US Senate committee in May 2025.

Photo by Demetrius Freeman/The Washington Post via Getty Images

What’s next?

Florida law states that an "aider and abettor" is as criminally responsible for a crime as the perpetrator. Because ChatGPT is not a person, Uthmeier called this "uncharted territory," but said Florida officials still want to determine whether OpenAI bears any culpability in the crime.

Uthmeier said that the Office of Statewide Prosecution has subpoenaed OpenAI for company policies, employee information and records relating to the Florida State University shooting.

Other lawsuits

Although this is the first time ChatGPT and OpenAI have been the focus of a criminal investigation, the company and others that have developed chatbots are no strangers to lawsuits. 

The parents of a 23-year-old man who died by suicide in July 2025 filed a wrongful death lawsuit against OpenAI late that year, claiming the chatbot worsened his depression and pressured him into suicide.

In October 2025, OpenAI announced that ChatGPT was updated “to better recognize and support people in moments of distress.”

Google’s Gemini was recently named in a similar lawsuit after the family of a 36-year-old man who died by suicide said the chatbot coached him through it. 

In response to the lawsuit, Google said, in part, that “Gemini is designed to not encourage real-world violence or suggest self-harm,” later adding: “In this instance, Gemini clarified that it was AI and referred the individual to a crisis hotline many times.”


Pew Research Center surveyed 1,458 US teens in 2025 and found that 64% of them used a chatbot.

Andriy Onufriyenko / Moment / Getty Images

Both lawsuits are still unresolved. 

In response to Florida’s probe, lawyers representing one of the victims of the FSU shooting said they plan to “file suit against ChatGPT, and its ownership structure, very soon, and will seek to hold them accountable for the untimely and senseless death of our client.”

A spokesperson for OpenAI told WCTV: “Our hearts go out to everyone affected by this devastating tragedy. After learning of the incident in late April 2025, we identified a ChatGPT account believed to be associated with the suspect, proactively shared this information with law enforcement and cooperated with authorities. We build ChatGPT to understand people’s intent and respond in a safe and appropriate way, and we continue improving our technology.”

If you or someone you know is in immediate danger, call 911. If you’re struggling with negative thoughts or suicidal feelings, resources are available to help. In the US, call the National Suicide Prevention Lifeline at 988. 


