A Federal Court Warning About Sharing Legal Issues With AI
Along with the rise of AI tools, people have started using ChatGPT, Claude, and similar chatbots as a go-to for legal help: drafting letters, shaping negotiation postures, filling out forms, planning strategy, analyzing issues, exploring potential outcomes, and much more. They have apparently become a popular place to turn for arraignment advice before hiring an attorney in criminal matters.
A federal judge just issued an opinion that should give everyone – everyone – deep concern about sharing any information with an AI tool about any legal issue that may end up in litigation. And because criminal law is the one area of law where you can lose your freedom, it is vital to understand what's going on here.

Federal Court Rules AI Conversations Are Not Protected by Attorney-Client Privilege
In United States v. Heppner, decided in mid-February, Judge Jed Rakoff ruled that a defendant’s conversations with a consumer generative-AI tool (in this case Anthropic’s Claude) are not protected by attorney-client privilege and are not protected as attorney work product. The court’s view was direct: an AI tool is not a lawyer. Nothing you tell it is protected by attorney-client privilege. Nor is it attorney work product, because the materials most certainly were not prepared by an attorney or at an attorney’s direction. The judge’s ruling left no wiggle room whatsoever.
The implication is straightforward and serious:
when you talk to AI about a legal problem, you are talking to a third party.
Why People Overshare Sensitive Legal Information With AI
AI users go all in. It's not their fault; AI chatbots are designed that way. They seem to really care, and they most definitely are nonjudgmental. Users end up uploading . . . well, everything. They draft timelines. They list accounts. They describe what was said on 'that call', what was sent by text, which messages were deleted, what they suspect about money, what they know about wrongdoing, what they think the other side will claim, and what they plan to say when the heat turns up. The AI will not stop them; it will encourage them to give more.
AI platforms are designed to prompt for detail and clarity. In everyday life that can be helpful. In a legal situation it can quietly produce something far more dangerous:
a detailed written record of your case — created by you.
Common Legal Information People Upload to AI Chatbots
- timelines of events
- summaries of conversations
- text messages and emails
- financial details
- theories about what happened
- speculation about what the other side will argue
- drafts of statements they plan to make
In other words, people often build the exact document a prosecutor or opposing attorney would love to see.
Privacy vs. Attorney-Client Privilege: Why AI Chats Aren’t Confidential
The Heppner ruling is a reminder that “privacy” and “privilege” are very, very different things. Attorney-client privilege is a narrow, specific protection for confidential communications between a client and a lawyer for the purpose of legal advice. An AI chatbot is a third party. Even if it feels like a private conversation, it is not.
How AI Conversations Can Become Evidence in Court
In civil litigation, opposing counsel asks for your notes, drafts, “communications with third parties,” and anything you used to “prepare” your position. In criminal matters the prosecution demands all this and more. If you have been using consumer AI tools like a diary, you have potentially created a neat, time-stamped record of your strategy, your fears, your leverage points, the things you planned to claim, and the things you hoped nobody would ask about. Ever.
In Heppner, prosecutors fought for those AI documents and got them.
Almost as an aside in his ruling, the judge was clear that if those AI documents had been compiled by Heppner's attorneys as they researched his case, checked statutes, looked for similar issues, and compiled emails and evidence ... in effect, had done what lawyers do in 2026 ... they would have been safe from the prosecution, in the same way lawyers' correspondence with expert witnesses is privileged and safe.
That distinction matters.
When a lawyer prepares legal strategy, research, and case materials, those materials are generally protected. When a person prepares them on their own with a third party — including an AI tool — those protections may disappear.
When Is It Safe to Use AI for Legal Questions?
None of this means “never use AI.” It means don’t confide in it.
Use it for public, generic questions: terminology, process, checklists you could safely print on a billboard. That's it, and, really, only if you must. Last we checked, libraries still worked and were still free.
Safe examples might include:
- looking up legal terminology
- understanding the basic structure of a court process
- reviewing publicly available procedures
- checking definitions or general legal concepts
Unsafe uses include:
- describing your case
- uploading documents or evidence
- explaining what happened
- outlining legal strategy
- discussing negotiations or defenses
Once that information is shared with a third party, control over it may be lost.
The Old Rule vs. the New Rule: Never Put Legal Strategy in AI Chats
The old rule in criminal law was: don’t put anything in writing you don't want to see or hear in the courtroom.
The new rule is: don’t put anything in an AI chat you don't want to see or hear in the courtroom.
Why AI Cannot Replace a Criminal Defense Lawyer
Criminal cases are not puzzles to solve with software. They involve liberty, reputation, and consequences that can last a lifetime.
AI tools can answer questions about vocabulary and procedure. They cannot provide confidential advice, protect your rights, or shield your strategy from prosecutors.
Only a lawyer can do that.
At Knauss Law, we believe every person facing criminal allegations deserves something far more reliable than a chatbot:
the right to experienced, confidential legal counsel.
Not AI counsel.
Real counsel.
If you are facing a criminal investigation, charges, or even an upcoming arraignment, speak with a criminal defense lawyer before sharing details with anyone — including an AI tool.
Your freedom is too important to outsource to software.
