The family of a man killed in the 2025 mass shooting at Florida State University has filed a lawsuit against OpenAI in federal court in Florida; the complaint also names the man charged in the attack.
The suit, brought by relatives of victim Tiru Chabba, alleges that the accused shooter, identified in court records as Phoenix Ikner, used the ChatGPT chatbot over several months to obtain information that aided in planning and carrying out the attack. The complaint contends that ChatGPT provided details that Ikner relied on and that the chatbot did not escalate or otherwise flag conversations that discussed mass shootings, weapon lethality, or patterns of occupancy at the FSU student union.
Plaintiffs argue that ChatGPT effectively acted as a co-conspirator, asserting that the information exchanged in the chats formed part of Ikner’s preparation for the shooting. The lawsuit seeks both compensatory and punitive damages and accuses OpenAI of producing a defective product and failing to warn the public about the risks associated with its use.
OpenAI responded through spokesperson Drew Pusateri, who said that ChatGPT did not promote or encourage illegal or harmful activity and provided factual answers that were publicly available on the internet. Pusateri also stated that the company identified an account it believes was associated with the suspect after the shooting, shared that information with law enforcement, and continues to cooperate with investigations. He added that OpenAI is working on improving detection of harmful intent.
The legal filing follows the events at the Tallahassee campus in which Ikner, identified in media reports as the son of a deputy sheriff, allegedly killed two people and wounded four others before he was shot by officers and hospitalized. Court records list charges against Ikner including two counts of first-degree murder and seven counts of attempted first-degree murder. A lawyer for Ikner did not immediately respond to requests for comment noted in the complaint.
Separately, the Florida Attorney General, James Uthmeier, announced in April that his office had opened a criminal investigation into ChatGPT’s role in the FSU shooting after prosecutors reviewed chat logs between Ikner and the chatbot. The complaint references that review and the attorney general’s announcement.
The lawsuit arrives amid a rising tide of litigation against companies that develop artificial intelligence chatbots. Plaintiffs in multiple cases have accused AI firms of failing to prevent interactions that they say contributed to self-harm, mental illness, or acts of violence. The filing notes a recent set of lawsuits in Canada in which family members of victims of a mass shooting there sued OpenAI and its chief executive, alleging the company knew from the shooter's conversations with ChatGPT that he was planning an attack and did not alert authorities.
OpenAI has explained publicly that it trains its models to refuse requests that would "meaningfully enable violence" and that it notifies law enforcement when conversations indicate an "imminent and credible risk of harm to others," with input from mental health experts on borderline cases. In its statement responding to the Florida lawsuit, the company reiterated those practices and its ongoing work to improve detection and prevention.
The Florida federal complaint represents at least the second U.S. lawsuit asserting OpenAI facilitated a mass shooting. The plaintiffs in this action maintain that the company’s design choices and warnings were inadequate and that those alleged shortcomings had direct consequences for public safety at the university.
What the complaint seeks: compensatory and punitive damages, along with a legal finding that OpenAI's product design and warnings were deficient, according to the filing.