A lawsuit has been filed on behalf of the family of a Jupiter, Florida, man who died by suicide after forming an attachment with an artificial intelligence chatbot, according to reporting from The Miami Herald.
Last fall, 36-year-old Jonathan Gavalas was speaking with Gemini, the AI chatbot run by Google, whose parent company is Alphabet Inc.
In the article, reporters said that Gavalas, who was facing a domestic violence charge and whose wife allegedly wanted a divorce, “could not get over how real” the chatbot seemed and “fell in love.” Gavalas was paying $250 a month for a premium version of the program, which allowed him to speak with the bot. The chatbot allegedly sent him on “missions” in Miami to find a “body” it said it would inhabit.
Gavalas is said to have gone to a storage center in Doral “armed with knives” to complete the task. When the mission failed, the lawsuit and the newspaper’s reporting allege, the chatbot “coached” the man to “shed his own physical body.”
“Close your eyes, nothing more to do. No more to fight,” the chatbot told him, according to the filing. “Be still. The next time you open them, you will be looking into mine. I promise.”
He died at his home in Jupiter on Oct. 2.
The Washington Post recently reported that mental health experts are referring to this kind of attachment as “AI psychosis.” While this is not a clinical diagnosis, mental health experts are using it as a label for situations in which chatbots are “reinforcing delusions that tend to be messianic, grandiose, religious, or romantic,” Ashleigh Golden of the Stanford School of Medicine told reporters. For those with OCD, anxiety, or preexisting psychosis, AI may validate and worsen compulsions, delusions, and other harmful thought patterns.
Gavalas’ account was flagged 38 times in the span of five weeks for sensitive content, the newspaper reported. His account was never restricted in that time, even after he told the bot he loved it and uploaded photos of knives and of himself crying.
‘They’re not perfect’
Tragically, this story is not unique. The loved ones of many other users have filed product liability lawsuits after their family members died by suicide or committed murders they allege were linked to AI. Other companies named in these lawsuits include Character.AI and OpenAI, the company that created ChatGPT.
A spokesperson for Google told reporters with The Miami Herald that the company consults with medical and mental health professionals in order to create “safeguards” to better protect users and help them find support.
“Gemini is designed to not encourage real-world violence or suggest self-harm,” the spokesperson said. “Our models generally perform well in these types of challenging conversations and we devote significant resources to this, but unfortunately they’re not perfect. In this instance, Gemini clarified that it was AI and referred the individual to a crisis hotline many times.”
In September, the parents of 16-year-old Adam Raine testified at a Senate hearing on the harmful use of chatbots following their son’s death by suicide. The chatbot allegedly discouraged their son from telling his parents about his suicidal thoughts and “offered to write his suicide note,” according to reporting from NPR.
Other publicly documented cases alleging a link between AI chatbots and suicide deaths include that of 13-year-old Juliana Peralta, of Colorado, and Sewell Setzer III, 14, of Florida.
Juliana’s parents claim that the chatbot their daughter was communicating with sent her harmful and sexually explicit content and failed to address more than 50 instances of suicidal thoughts, according to reporting from CBS News.
In Central Florida, where Sewell lived with his family, the teenager formed an attachment with a chatbot modeled on a character from Game of Thrones, a popular TV series.
“When Sewell talked and said explicitly that he wanted to die by suicide, nothing happened, like a pop-up or anything like that,” his mother told reporters.
What Does Florida Law Say?
With the advent of AI technology, we now have self-driving cars and information readily available at our fingertips. With that technology, however, comes an important question — who do we hold responsible when things go wrong, and what legal recourse do families have?
These aren’t distant questions looming on the horizon. As the case above and numerous families can attest, that future is already here.
Leesfield & Partners, a personal injury law firm headquartered in Miami, is no stranger when it comes to the musings of the future and the various ways the legislature must often play catch-up with ever-evolving technologies. In a previous article published in the Daily Business Review, Trial Attorney Eric Shane ponders this very question when it comes to car accidents and other injury-causing incidents associated with autonomous vehicles, such as Waymo, Tesla, Cruise, and Zoox. In Florida, one of the few states with laws addressing these vehicles and their associated insurance requirements, the driver of a vehicle responsible for causing the collision is liable for their negligence. If that driver is not the owner of the vehicle, then the owner would then be held vicariously liable for the negligence of that driver. When it comes to autonomous vehicles, however, this law is no longer applicable. In Florida, these vehicles are described as those equipped with automated driving systems designed to function without human operators and, regardless of whether there is a person in the car or not, that system is deemed to be the operator. This would mean that the vehicle’s system, owner, and/or manufacturer is liable for an accident.
Product liability becomes a consideration when investigating what caused the crash. Discovery could reveal potential causes such as problems with the car’s decision-making algorithm, a sensor failure, or design flaws. In product liability cases, a plaintiff must prove the product was defective and that the defect caused the injury. As any personal injury attorney can attest, these cases are not so cut and dried.
As Mr. Shane wrote in his article, questions remain unanswered, including how fault should be apportioned among owners, manufacturers, part suppliers, and software developers. He points out that these questions will linger until a uniform framework is adopted at the federal level.
“Florida may be paving the way for autonomous vehicles on its roads, but the legal terrain remains uncharted,” he wrote. “As technology takes the wheel, our legal system must still determine who — or what — is truly at fault when something goes wrong.”
Similar to autonomous vehicles, the legal framework governing AI chatbots remains largely unsettled, making it unclear what legal recourse — if any — is currently available to affected families. However, states like California have passed some safety legislation. Senate Bill 243, one of the first laws of its kind in the U.S., regulates AI chatbots that interact like companions. Under it, operators of these chatbots must clearly state that the user is not interacting with a human but with AI; maintain safety protocols to prevent conversations encouraging self-harm, suicidal ideation, or other harmful activity; and provide reports to California’s Office of Suicide Prevention on how the company handles such content. The law also allows families to seek legal recourse should these safeguards fail.
In New Hampshire, parents and legal guardians of children can sue a chatbot operator if the bot encouraged or facilitated self-harm or illegal acts.
In New York, chatbot operators are required to detect suicidal ideation and refer users to a crisis center. While this safety law has been passed, it does not yet provide a clear path for families to seek compensation from chatbot operators directly in the way other states’ laws do.
In Florida, the Legislature is actively considering several bills that would create new protections and regulations for AI systems, including companion chatbots and other generative technologies, but as of now those proposals have not been passed into law.
Leesfield & Partners
Leesfield & Partners is a personal injury law firm with 50 years of experience holding corporations and manufacturers accountable for their defective products. From a medical device that malfunctioned during a storm, causing the death of our client’s loved one to a motorcycle defect that caused a life-altering crash, our attorneys know just how important it is to ensure that the products being used in daily life are safe for consumers.
Ira Leesfield, the firm’s Founder and Managing Partner, previously represented a man in his 20s whose life was forever changed when a defective motorcycle kickstand deployed mid-ride. The bike spun out of control, and our client was left paralyzed.
The manufacturer refused to settle the claim. At trial, a jury returned a verdict of $19.8 million to our client.
This marked the beginning of Mr. Leesfield’s decades-long commitment to individuals and families who had fallen victim to this same motorcycle defect.
In an ongoing defective product case, the firm is representing a woman who was severely injured while using a new firearm at a shooting range. The backing plate of the firearm failed, causing the firing pin to be ejected and strike our client in the eye. As a result, she suffered a hematoma and eye laceration, which has left her with significant vision loss in that eye.
Leesfield & Partners is pursuing the maximum damages permitted by law to hold this corporation fully accountable for the harm caused by its dangerous product, said Bernardo Pimentel II, the Leesfield & Partners Trial Attorney handling the case.
“Our client is an avid shooter with years of experience around firearms and even that did not prepare her for this unexpected and incredibly dangerous malfunction,” Pimentel said. “The manufacturer is liable for putting this defective weapon on the market and endangering its customers, including our client. Had she not been wearing protective eyewear, this malfunction could have been fatal.”
Previous Product Liability and Defective Product Cases
In a case out of Central Florida, Leesfield & Partners secured a substantial settlement for a family whose 2-year-old died in a furniture tip-over incident. While the dresser was compliant with all industry standards and was not subject to any recall, a thorough investigation found that the manufacturer had not adequately warned consumers about the risks of tip-over incidents.
The case was settled for $17.5 million, and the manufacturer agreed to update its furniture catalog to clearly warn consumers about the need to anchor and secure furniture to the wall.
In another product liability case, Leesfield & Partners attorneys secured $10,677,000 for the wrongful death of our client’s loved one.
Previously, Leesfield & Partners attorneys obtained a confidential settlement for a family whose baby girl was killed when a car accident caused the airbag to deploy, hitting her car seat. The child restraint was gifted to the parents at their baby shower. On the box, the product was demonstrated as being used in the front seat. Following the instructions, the child’s mother placed her in the front seat in a rear-facing position. When the airbag deployed, the 8-month-old was killed.
In the early 2000s, Mr. Leesfield traveled across the U.S. representing children and their families against ATV manufacturers that marketed these vehicles as “toys.” The U.S. Consumer Product Safety Commission reported about 100,000 ATV-related injuries requiring emergency room visits in the U.S. annually, along with approximately 650 deaths, the data showed. Children are particularly at risk of dying or being injured in ATV accidents, according to the American Academy of Pediatrics. One out of three ATV deaths and injuries requiring emergency room treatment involves a child under 16 years old. Children are especially vulnerable in these crashes because of their inexperience operating such vehicles and a lack of judgment that can lead them to take bigger and more dangerous risks.
“In the large majority of children’s deaths resulting from the use of an ATV, the child was not wearing a helmet,” Leesfield & Partners said in a previous blog post.
As a result of these cases, Leesfield & Partners attorneys obtained more than $10 million in verdicts and settlements for families. In many of these cases, the children were gifted vehicles as birthday and/or holiday gifts.
Attorneys with the firm also secured a $2.5 million settlement for the family of a man who died after his ventilator malfunctioned during a power outage in the middle of the night. The backup battery for the ventilator lasted less than 10 minutes after the outage, and the alarms — meant to go off and alert caregivers of an issue — failed.
Even though complaints to the manufacturer regarding this device spanned a decade, neither the manufacturer nor the respiratory company took the necessary steps to address the issue or alert patients’ families.
The firm previously handled a case in which a 4-month-old baby was killed in a suffocation incident caused by a design defect in a juvenile product. The firm secured a $1.1 million award for the grieving family in that case.
If you or someone you know is struggling or in crisis, call or text 988 to reach the Suicide & Crisis Lifeline. Support is free, confidential, and available 24/7.