U.S. 5th Circuit Court seeks regulation on lawyers’ AI use in legal filings

A federal appeals court in New Orleans is considering a proposal that would require lawyers to certify whether they used artificial intelligence programs to draft briefs, and to confirm either that a human independently reviewed the accuracy of any AI-generated text or that no AI was relied upon in their filings before the court.

In a notice issued on Nov. 21, the U.S. Court of Appeals for the Fifth Circuit unveiled what appears to be the first proposed rule among the country's 13 federal appeals courts aimed at regulating the use of generative AI tools, such as OpenAI's ChatGPT, by lawyers appearing before the court.

Screenshot of the Fifth Circuit rule. Source: Fifth Circuit Court of Appeals

The proposed rule would apply to lawyers and unrepresented litigants appearing before the court, requiring them to certify that, if an artificial intelligence program was used to produce a filing, both its citations and legal analysis were reviewed for accuracy. Attorneys who misrepresent their compliance with the rule could have their filings stricken and face sanctions, according to the proposal. The Fifth Circuit is accepting public comment on the proposal until Jan. 4.

The proposed rule comes as judges across the country grapple with the rapid proliferation of generative AI programs like ChatGPT and weigh the need for safeguards when incorporating the evolving technology into their courtrooms. The risks of lawyers using AI drew prominence in June, when two New York lawyers were sanctioned for submitting a legal document containing citations to six cases fabricated by ChatGPT.

Related: Sam Altman's ouster shows Biden isn't handling AI correctly

In October, the U.S. District Court for the Eastern District of Texas adopted a rule, effective Dec. 1, requiring lawyers who use artificial intelligence programs to “evaluate and authenticate any computer-generated content.”

In statements accompanying the rule change, the court emphasized that “often the output of such tools may be factually or legally incorrect” and stressed that artificial intelligence technology “should never replace the abstract thinking and problem-solving skills of lawyers.”

Magazine: Train AI models to sell as NFTs, LLMs are big lying machines - AI Eye