A lawyer used ChatGPT for a legal filing. The chatbot cited nonexistent cases it just made up.


Lawyer Steven Schwartz of Levidow, Levidow & Oberman has been practicing law for three decades. Now, one case could derail his entire career.

Why? He relied on ChatGPT for his legal filings, and the AI chatbot manufactured previous cases, which Schwartz then cited, out of thin air.

It all starts with the case in question, Mata v. Avianca. According to the New York Times, an Avianca customer named Roberto Mata was suing the airline after a serving cart injured his knee during a flight. Avianca asked the judge to dismiss the case. In response, Mata’s lawyers objected and submitted a brief citing a slew of earlier, supposedly similar court decisions. And that’s where ChatGPT came in.

Schwartz, Mata’s lawyer who filed the case in state court and then provided legal research once it was transferred to Manhattan federal court, said he used OpenAI’s popular chatbot to “supplement” his own findings.

ChatGPT provided Schwartz with multiple names of similar cases: Varghese v. China Southern Airlines, Shaboon v. Egyptair, Petersen v. Iran Air, Martinez v. Delta Airlines, Estate of Durden v. KLM Royal Dutch Airlines, and Miller v. United Airlines.

The problem? ChatGPT completely made up all those cases. They do not exist.

Avianca’s legal team and the judge assigned to the case soon realized they could not locate any of these court decisions, which led Schwartz to explain in an affidavit on Thursday that he had turned to ChatGPT for help with his filing.

According to Schwartz, he was “unaware of the possibility that its content could be false.” The lawyer even provided the judge with screenshots of his interactions with ChatGPT, in which he asked the chatbot whether one of the cases was real. ChatGPT responded that it was, and even confirmed that the cases could be found in “reputable legal databases.” Again, none of them could be found, because the chatbot had invented them all.

It’s important to note that ChatGPT, like all AI chatbots, is a language model trained to follow instructions and respond to a user’s prompt. It generates plausible-sounding text rather than retrieving verified facts, so if a user asks it for information, it can produce exactly what they seem to be looking for, even when it isn’t factual.
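For readers curious what such a query looks like in practice, here is a minimal sketch, assuming the `openai` Python package (v1+) and an API key in the environment; the model name and prompt are illustrative, not taken from Schwartz’s filing. The point is that nothing in this call consults a legal database, so a confident answer is not evidence a case exists.

```python
# Sketch only: assumes the `openai` Python package (v1+) and an
# OPENAI_API_KEY environment variable. Prompt and model are illustrative.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {
            "role": "user",
            "content": (
                "Is Varghese v. China Southern Airlines a real case? "
                "If so, summarize the holding."
            ),
        },
    ],
)

# The model simply generates plausible text; it does not check any
# legal database, so its answer must be verified independently.
print(response.choices[0].message.content)
```

Any citation produced this way would still need to be checked against an actual legal research service or the court’s own docket before it goes into a brief.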

The judge has ordered a hearing next month to “discuss potential sanctions” for Schwartz in response to this “unprecedented circumstance”: a lawyer filing a legal brief built on fake court decisions and citations supplied to him by ChatGPT.





Source link: https://mashable.com/article/chatgpt-lawyer-made-up-cases
