Earlier this month, a federal judge in Mississippi delivered a ruling that was riddled with factual errors, from naming plaintiffs that didn’t exist to making up quotes from a state law to citing cases that don’t appear to exist.
As Mississippi Today reports, US district judge Henry Wingate’s baffling temporary restraining order immediately raised concerns that it had been generated by an AI.
We’ve already come across plenty of instances of lawyers getting caught red-handed using tools like OpenAI’s ChatGPT to generate AI slop, which tends to get called out quickly in the factually rigorous world of the courts.
However, seeing an appointed district judge accused of essentially the same thing should raise some serious red flags. Judges have previously doled out fines and sanctions to lawyers who got caught using AI — but nobody seems to know with any degree of certainty what will happen now that the dynamic has flipped.
Worse yet, Wingate eventually replaced the error-riddled order with a corrected version — which still cites a 1974 case that doesn’t appear to exist, according to Mississippi Today.
“Yikes!” lawyer Eric Wessan tweeted. “The rush to issue orders like this undermines the judiciary.”
Lawyers had initially asked Wingate to clarify the restraining order, which pauses enforcement of a state law that prohibits diversity, equity, and inclusion programs in public education — a request that led Wingate to issue the correction.
The original version listed the Mississippi Library Association and Delta Sigma Theta Sorority Inc. as plaintiffs, even though neither organization has ever been involved in the litigation or has any cases pending whatsoever.
“Our attorneys have never seen anything like this,” a Mississippi Attorney General’s Office official told Mississippi Today.
Wingate has yet to publicly comment on his ruling or whether he used AI.
However, the evidence is damning. Case in point: the order misquoted the initial lawsuit and cited cases that don’t appear to exist, both hallmark signs of AI “hallucinations.”
“I actually don’t know how to explain the backstory here,” University of Miami law school professor Christina Frohock told Mississippi Today. “I feel like I’m Alice in Wonderland.”
Frustratingly, there’s a chance we will never find out whether Wingate, a 78-year-old judge nominated by former president Ronald Reagan in 1985, used AI.
“If an attorney does this, a judge can demand explanations, but it’s not true in the other direction,” she added. “We will probably never know what happened, unless an appellate court demands it.”
More on lawyers and AI: If You’ve Asked ChatGPT a Legal Question, You May Have Accidentally Doomed Yourself in Court