Earlier today, I posted about a lawyer filing unverified ChatGPT-generated documents with hallucinated cases. But even if lawyers manage to avoid this, I’m sure many unrepresented litigants will use ChatGPT, Bard, etc., and won’t be able to verify the results properly.
Indeed, a quick search of CourtListener turned up three recent filings (1, 2, 3) that expressly stated that they were relying on ChatGPT. This suggests that there are plenty of others who have used ChatGPT but haven’t mentioned it. (To my knowledge, there is no obligation to disclose such matters.)
Also note that self-represented litigants are quite common: even setting prisoner filings aside (since I don’t know how many prisoners have access to ChatGPT), in federal court, “from 2000 to 2019,… 11% of civil filings involving non-prisoners involved self-represented plaintiffs and/or defendants.” And I expect it to be even more common in state courts, for example in divorce and child custody cases, where I’m told self-representation is even more prevalent. (Family court plaintiffs may feel they have to file for divorce, even if they cannot afford an attorney, and defendants may be sued for divorce or in disputes over child custody even if they have no money for the plaintiff to recover.) And even without limiting things to such categories of cases, it appears that “The workload of most California judges now consists primarily of cases in which at least one party is self-represented.”
See also this post from late February for an early report along these lines, in which a commenter mentioned that he had used ChatGPT-3 for a state court filing; and see this January article for the history of the DoNotPay ticket litigation. If you know of any cases involving pro se litigants using ChatGPT and similar programs, please let me know.