‘AI hallucinations’ case lands in hands of CT high court. Lawyers used computer generated details.

The state Supreme Court faces for the first time this week what have become known as "AI hallucinations," a troubling phenomenon created by law firms' expanding use of generative artificial intelligence programs that, it is now being learned, produce erroneous legal filings in some cases and outright falsehoods in others.

The issue reached the court through computer-generated inaccuracies in appeal briefs filed by a Wallingford law firm representing a corporate landlord trying to evict tenants over rent disputes. Dozens of other courts around the country — Connecticut's federal courts among them — already are dealing with the problem.

Late last year, U.S. District Judge Janet C. Hall, sitting in New Haven, outlined the difficulties created by the use of AI writing programs when she fined New York attorney David Stitch $500 after he inadvertently filed what was determined to have been a spurious, computer-generated brief on behalf of an employee suing the owner of a New Canaan pizza parlor over unpaid wages.

“The court understands that, in issuing this order, it does so in a proverbial new frontier as society begins to grapple with both the power and potential danger of this technology,” Hall wrote. “Indeed, state and federal courts around the country have begun confronting the phenomenon of so-called ‘hallucinated’ citations in court papers—that is, citations to legal authority that are either partly or wholly nonexistent.”

The Connecticut Supreme Court has demonstrated its concern over the erroneous legal filings in the eviction cases — it ordered the lawyers who took responsibility to produce an explanation — but the justices could limit argument scheduled Monday to the underlying landlord-tenant dispute, deferring consideration of AI generated legal filings and a possible sanction to a later date.

The developers of generative AI software are marketing all sorts of programs that purport to research and write. Those targeting the legal profession promise to reduce or eliminate the time and effort required to research and write briefs that cite legal precedents in support of whatever proposition an attorney is trying to assert.

In the cases before the Supreme Court as well as the wage case before Hall, there is no suggestion that lawyers intentionally buttressed their arguments with phony precedents in order to win an unfair advantage.

Attorneys Ian G. Gottlieb, David E. Rosenberg and Paul J. Small, partners in the Wallingford firm GLG Law, took responsibility for the computer-generated errors. They represent the landlord trying to evict tenants in the two related cases the court will hear on Monday morning.

“Counsel for the Appellant failed to properly proof its citations and references were accurate during the review process,” they said in a memo after the court ordered them to produce an explanation. “Counsel takes this situation very seriously and deeply regrets that these errors occurred and for any inconvenience to the court and all counsel.”

In the Connecticut cases, as well as hundreds of similar cases around the country, the lawyers used software that produced what appeared to be somewhat clumsily written but otherwise solid legal briefs. A look beneath the surface revealed citations that were erroneous or computer-generated fiction.

According to the American Bar Association, the problem is not limited to lawyers. An ABA Journal article in January reported that federal judges in Mississippi and New Jersey withdrew rulings after litigants pointed out errors, including nonexistent allegations, misstated case outcomes and made-up quotes. A state appellate court in Georgia also overturned a divorce decree after discovering the trial judge's order relied on nonexistent case law.

Such hallucinatory citations are, according to judges and lawyers, troubling at a variety of levels, not the least of which is their threat to the integrity of the judicial system.

Perhaps more disturbing is the effect AI could have on the system if generative writing programs are used by unscrupulous parties to create phony evidence. An example could be the creation of false business records, such as credit card statements, that are often admitted into evidence with little scrutiny.

The erroneous citations in the cases before the state Supreme Court were identified by students with the Jerome N. Frank Legal Services Organization, a legal aid organization affiliated with the Yale Law School. The organization was allowed to intervene in the case on behalf of two tenants facing eviction.

In a brief to the court, Jeffrey Gentes, a Yale law clinical lecturer supervising the students, said the danger of "falsely generated citations" goes beyond the inherent suggestion of "nonexistent precedent."

“Falsely generated case citations may be difficult to identify and detect, particularly when they are rampant throughout a 60-page brief,” Gentes wrote. “They especially harm self-represented or disadvantaged parties, who may not have the time and resources to detect such inaccuracies and may assume that their opposing counsel’s brief accurately represents case law and is written in good faith.”

Others have argued that a lazy lawyer who relies on a computer to write arguments could actually hurt his client by overlooking real — and perhaps more powerful — citations.

The state Judicial Branch is taking steps to neutralize whatever problems may arise from computer generated writing and research, branch officials said. Committees of judges are examining both the research of citations and the possibility of fabricated evidence. One measure under consideration is to require lawyers to certify that whatever they file has been proofread and verified as accurate.

Should errors be discovered after certification, the authors could be subject to discipline under the bar’s code of conduct, which establishes standards for competence and professionalism.

One of the issues to be decided is the appropriate discipline or sanction — such as fines — to be imposed on attorneys responsible for phony, AI-generated legal filings. The court will not discuss matters pending before it, and it could not be determined whether it will address either the faulty citations or discipline during arguments on Monday.

The justices have a variety of options, including referring the matter to the Statewide Grievance Committee, which has authority to discipline lawyers for violations of the rules of professional conduct.

The discovery of computer-generated errors has overshadowed substantive issues in the tenant-landlord rent and eviction disputes that the court will hear Monday. In both cases, a Brooklyn, N.Y., real estate company is challenging the authority of municipal fair housing commissions in Hartford and Middletown.

Should the landlord prevail, the decision could significantly weaken tenant rights in Connecticut.

In both cases, the landlord substantially raised apartment rents under circumstances the tenants considered unfair. Both tenants appealed to their respective city fair rent commissions, which intervened on behalf of the tenants. A Superior Court upheld the commissions and, when the landlord appealed, the Supreme Court took the case.
