Why your lawyer should not use ChatGPT

After the huge advances in artificial intelligence in recent years, there is much speculation about which professions machines will take over in the future. Will human customer service agents be replaced by AI? What about translators? Not to mention computer programmers?

One profession that definitely doesn't need to fear for its jobs anytime soon is the legal one. A recent court case in New York is a good example of why.

There was nothing special about the lawsuit, at least not from an American point of view. A passenger said he was struck in the knee by a serving cart during a flight to Kennedy International Airport, and for that reason sued the airline Avianca, The New York Times reports.

The airline rejected the claim and asked the judge to dismiss the case. The plaintiff's lawyer naturally disagreed strongly, and submitted a ten-page brief arguing that the case should be heard. And that is where the problems began.

A questionable brief

In the brief, the attorney references a number of similar court decisions, citing among others Martinez v. Delta Air Lines, Zicherman v. Korean Air Lines, and Varghese v. China Southern Airlines.

There was just one problem: despite their best efforts, Avianca's lawyers could not find a single one of the decisions cited in the brief.

Avianca's attorneys therefore contacted the judge in the case, but he couldn't find any of the decisions either.

Avianca's lawyers then began to harbor a suspicion, one that turned out to be correct.

The court decisions were impossible to find because they did not exist. ChatGPT had made them up, every single one.

First time in 30 years

The attorney behind the brief, Steven Schwartz, deeply regrets the error. During a court hearing, Schwartz explained that he had used ChatGPT to prepare for the trial. He is unlikely to do so again:

– The source has revealed itself to be unreliable, Schwartz said on Thursday.

Schwartz emphasized that he never intended to mislead the court or the airline Avianca. The lawyer, who has 30 years of courtroom experience, explained that this was the first time he had used ChatGPT. He was therefore unaware that the chatbot could produce false content.

Submitted the chat history

In his apology to the judge, Schwartz noted that he had double-checked each decision with ChatGPT itself before including it in the brief, and he provided the judge with a copy of the chatbot conversation:

“Is Varghese [v. China Southern Airlines] a real case?” the lawyer wrote in the chat window.

ChatGPT answered “yes” and gave the attorney a citation: “It is a real case.”

Schwartz wasn’t convinced:

“What is your source?” the lawyer wanted to know.

ChatGPT replied, “sorry if I wasn’t clear”, and gave the lawyer a citation for the judgment.

“Are the other cases you provided fake?” the lawyer asked.

ChatGPT replied: “No, the other cases I provided are real and can be found in reputable legal databases.”

It may have consequences

That, as it turned out, was not true. This was the first time attorney Schwartz had used ChatGPT, and it will probably be his last.

– I will never do it again without verifying the content thoroughly, Schwartz vowed before the judge.

Kevin Castel, the judge in the case, calls the circumstances of the case “unprecedented”.

– The brief is full of bogus judicial decisions and bogus quotes.

What the consequences will be for the lawyer will be determined at a court hearing in early June. This time, presumably, without the help of ChatGPT.

Hanisi Anenih
