Judge Terminates Case After Lawyer Submitted AI-Generated Fake Citations and Bradbury Quotes
Rare sanctions imposed after attorney repeatedly filed documents with hallucinated legal references
District Judge Katherine Polk Failla ruled that severe sanctions were warranted after attorney Steven Feldman repeatedly responded to orders to correct his filings with new documents that still contained fabricated legal citations. One filing stood out for its conspicuously florid prose, a stark departure from Feldman's typically error-filled writing.
The case represents one of the harshest judicial responses yet to the growing problem of lawyers using AI tools without adequate verification of their outputs.
Analysis
Why This Matters
This case joins a growing list of matters in which lawyers have been sanctioned for AI misuse, signaling that courts will not tolerate unchecked reliance on language models in legal filings.
Background
The AI hallucination problem in legal practice first drew widespread attention in 2023, when lawyers in a federal case were sanctioned for citing nonexistent cases generated by ChatGPT.
Key Perspectives
Bar associations are scrambling to issue guidance on AI use in legal practice, while technology companies maintain that users bear responsibility for verifying AI outputs.
What to Watch
Whether state bars begin requiring AI-use disclosure in court filings or mandatory competency training for attorneys who rely on these tools.