“Would You Be Surprised to Learn This Case Does Not Exist?”

Just a heads-up for young lawyers—the above is not something that you want to hear a judge say to you in court, at least if the judge is holding a document that you are responsible for. If you’re hearing it, it’s already too late. The only thing to do now is pull your jacket up over your head and run for an exit. You’ll probably run into a wall instead, because you can’t see where you’re going. You pulled your jacket up over your head, remember? I didn’t tell you to do that because I thought it’d be helpful, I just thought it’d be funny. Man, don’t do that again.

Also don’t use ChatGPT. Or any of its artificial-“intelligence” cronies.

This is not by any means the first time ChatGPT, or Gemini, or Bard, or Copilot, or Claude, or Jasper, or Perplexity, or Steve, or Frodo, or El Braino Grande, or whatever stupid thing it is people are using, has embarrassed a lawyer by just completely making things up. In June 2023, two New York lawyers got sanctioned for filing something they used AI to write, because unbeknownst to them it was citing cases that didn’t exist. Last year, a lawyer for former Trump henchman Michael Cohen (who used to be a lawyer but now isn’t, because of crime) filed something Cohen had given him that also had fake AI-generated citations in it. And in January, an expert witness who testifies about misinformation got in trouble for—you guessed it—unintentionally citing AI-generated misinformation.

But people keep doing it.

The latest example involves Marvelous Mike Lindell, the “My Pillow” guy who kept saying that U.S. voting machines were rigged, until finally voting-machine companies had enough and filed defamation lawsuits suggesting he prove his claims. Turns out he could not, and he and his company have been hammered in court. Lindell now claims to be broke, but I bet he’s rich in pillows.

This week’s debacle was in the District of Colorado, where Lindell and his company are being sued for something or other. That case is nearing trial, and in February one of Lindell’s lawyers filed an opposition to a motion that was set for hearing on April 21. That hearing … did not go well.

First, the lawyer apparently didn’t know that the motion was going to come up that day, although the court pointed out he had plenty of notice. When he did learn it was going to come up, he immediately pulled his jacket up over his head and ran for an exit. Or at least that’s what he should have done, because it would have gone better. The court was not pleased with his opposition brief:

As discussed extensively on the record, after confirming with Mr. Kachouroff that he signed the Opposition consistent with his obligations under Rule 11 of the Federal Rules of Civil Procedure [which basically says your signature is a promise you did your job], the Court identified nearly thirty defective citations in the Opposition…. These defects include but are not limited to misquotes of cited cases; misrepresentations of principles of law associated with cited cases, including discussions of legal principles that simply do not appear within such decisions; misstatements regarding whether case law originated from a binding authority such as the United States Court of Appeals for the Tenth Circuit; misattributions of case law to this District; and most egregiously, citation of cases that do not exist.

Oh no. Why did this happen? You know, and I know, and the judge knew, but Kachouroff then made another critical mistake that we’ve also seen before—he didn’t immediately apologize and take responsibility. Nope! He fudged, blamed somebody else, and generally tried to avoid admitting what the court already knew. The court gave him every opportunity to fess up, including this one: “When asked whether he would be surprised to find out that the citation Perkins v. Fed Fruit & Produce Co., 945 F.3d 1242, 1251 (10th Cir. 2019) appearing on page 6 … did not exist as an actual case, Mr. Kachouroff indicated that he would be surprised.” How could that be?!

Not until this Court asked Mr. Kachouroff directly whether the Opposition was the product of generative artificial intelligence did Mr. Kachouroff admit that he did, in fact, use generative artificial intelligence. After further questioning, Mr. Kachouroff admitted that he failed to cite check the authority in the Opposition after such use before filing it with the Court—despite understanding his obligations under Rule 11…. Even then, Mr. Kachouroff represented that he personally outlined and wrote a draft of a brief before utilizing generative artificial intelligence. Given the pervasiveness of the errors in the legal authority provided to it, this Court treats this representation with skepticism.

I too “treat that representation with skepticism,” but also amusement. Either way, he was admitting he didn’t know what he was doing.

The upshot is that the court ordered the lawyers to show cause why they and their clients should not be sanctioned for violating Rule 11 and the Rules of Professional Conduct, having trusted generative artificial “intelligence” with writing a brief for them and then not checking its “work.”

Will they ask ChatGPT to handle their response to the order? I will be only mildly surprised if they do.
