In 2021, Frankie Johnson filed a lawsuit against Alabama prison officials for "failing to keep him safe, rampant violence, understaffing, overcrowding and pervasive corruption" after surviving multiple stabbings while incarcerated. As usual, the Alabama attorney general's office hired the Butler Snow law firm. This time, a federal judge sanctioned the firm after legal citations an attorney had used turned out to be nonexistent: AI had made them up.
A California judge also fined two law firms $31,000 after a legal brief in a civil lawsuit over denied insurance coverage used AI-generated "false, inaccurate, and misleading legal citations and quotations." The first firm had made the mistakes. The second failed to catch them. Had judicial review not caught them either, fake information could have found its way into an official court order.
Whether working for the defense or the prosecution, lawyers have introduced a new, messy variable into legal cases. But AI usage wasn't unanticipated.
In 2017, the American Bar Association (ABA) championed AI tools for offloading "low-level tasks." Yet it wasn't until 2024 that the ABA issued its first formal guidance on using generative AI. At least 106 cases globally have been flagged for "AI hallucinations" so far.
From lawyers filing made-up information to AI-botched police reports and facial recognition misfires, AI is only the newest manifestation of the incompetence and illegitimacy foundational to the criminal legal system. These "mistakes" steal, destroy, and exploit the lives of incarcerated people.