Legal AI News: What Your Attorney Wants You to Know About Using AI for Legal Advice

Legal AI News – AI platforms like ChatGPT, Perplexity, Grok, and others can feel like a private sounding board. They’re not. Here’s what that means if you end up in court.

By Tim Watson, Attorney at MGW Law Partners, Fayetteville, AR

 
About the Author: Tim Watson is a practicing attorney at MGW Law Partners. He advises clients on a wide range of civil legal matters and keeps a close eye on how emerging technology — including artificial intelligence — is reshaping the legal landscape.
 
Artificial intelligence is changing just about every industry right now, and the legal world is no different. Most attorneys I know have incorporated AI tools into some part of their practice at this point. That’s not surprising — it’s a genuinely useful technology. What does surprise me, though, is how often clients come into my office having already had long, detailed conversations with ChatGPT or Claude about their legal situation. And I mean detailed — the kind of conversations you’d normally only have with your attorney.
 
That’s where things can get complicated.

Legal AI News: The Gap Between “Helpful” and “Protected”

 
Here’s the thing: when you hire me, the law recognizes a specific and important relationship between us. It’s called attorney-client confidentiality. You can tell me anything about your situation — anything — and I can’t repeat it. That protection exists for a reason. It’s what allows people to be completely honest with their attorney so they can actually get effective legal help.
 
ChatGPT is just like some guy out on the street. You don’t have confidentiality with ChatGPT.

“Confidentiality has value. There is a reason for it. There is a reason you want to go and hire an attorney and be able to talk to somebody.” — Tim Watson, Attorney, MGW Law Partners

AI platforms like ChatGPT, Grok, Perplexity, and others can offer a lot of good general information. But that information comes at a cost: no privilege, no protection, and no legal shield between what you share and what a court might someday demand to see.

Legal AI News: A Real Case That Should Give Everyone Pause

 
This isn’t theoretical. In a case out of federal court in New York, the government was able to obtain a criminal defendant’s conversations with an AI chatbot — specifically Claude — and use them as evidence. The defendant had been going to Claude and sharing things like admissions of guilt and questions about what to do. The government discovered all of it.
 
Real-World Example: In a federal New York case, a criminal defendant’s AI chatbot conversations were obtained by the government and introduced as evidence in court. The defendant’s attorney tried to block it. It didn’t work — because those conversations weren’t privileged.
 

His attorney tried to fight it. Couldn’t. Those conversations weren’t privileged in any way. They were just records held by a company, and the government had a right to get them.

Now, that’s a New York case and it’s not Arkansas law specifically, but it is instructive about how courts are beginning to analyze this. And the reasoning isn’t going away.

Legal AI News: What About Civil Cases?

 
Most of my practice is civil law, not criminal. And the same concern applies. Any time you’re involved in a civil lawsuit, there’s a discovery process — the other party has the right to ask for documents and communications that are relevant to the case, and you generally have to produce them.
 
Whether courts will routinely order disclosure of a client’s AI chatbot conversations in civil matters is still developing. But here’s what I know for certain: those conversations aren’t privileged the way conversations with your attorney are. That’s the baseline reality. And if the opposing party asks for them, you may have very little ground to stand on in objecting.
 

Legal AI News: So Does This Mean You Shouldn’t Use AI at All?

 
No, it’s not that stark. AI isn’t going anywhere, and I’m not suggesting people stop using it for general research or information gathering. What I am saying is that if you’re involved in a case — or if you even think you might be heading toward one — you need to think about your AI conversations the same way you’d think about talking to anyone else out on the street. That stranger might have to testify someday. So might the record of what you said to a chatbot.
 
The bottom line: AI chatbots can give you a lot of useful general information, but they cannot give you attorney-client privilege. If your legal situation involves anything you wouldn’t want a judge or the opposing party to read, do not share it with an AI. Call an attorney instead.
 
I know it sounds self-serving for a lawyer to tell you to call a lawyer. But the reason that advice holds up isn’t about business — it’s that confidentiality genuinely protects you in a way that no AI platform currently can. I’ve got the legal training and the experience to give you advice tailored to your actual situation, and anything you tell me stays between us. That’s not something you get from a chatbot, no matter how good its answers might be.

Legal AI News: A Glimpse at Where This Might Be Going

 
Here’s a scenario that might sound far-fetched right now but probably isn’t forever: a courtroom where a chatbot’s conversation history is effectively “cross-examined” — where the full record of what you told an AI is presented, statement by statement, to a jury or a judge. We’re not there yet, thankfully. But the direction things are heading makes that less science fiction than it used to be.
 
The safest approach, right now, is to treat your AI conversations the way you’d treat any other unprotected communication — carefully, and with the assumption that someone else could read them someday.

Frequently Asked Questions

Are my conversations with ChatGPT or Claude protected by attorney-client privilege?

No. Attorney-client privilege only applies when you are communicating with a licensed attorney you have formally hired. AI chatbots like ChatGPT and Claude are not attorneys, and there is no legal protection covering what you share with them. Those conversations could potentially be accessed and used as evidence in legal proceedings.

Can my AI chatbot conversations be used against me in court?

Yes. A federal court case out of New York showed exactly that. A criminal defendant’s conversations with an AI chatbot were discoverable because they were not protected by any privilege. The government was able to access those communications and introduce them as evidence, even over the defendant’s attorney’s objections.

Should I stop using AI for legal questions entirely?

Not necessarily. AI can provide general information, but if you are involved in or anticipate being involved in a legal case, you should treat anything you share with an AI the same way you’d treat a conversation with a stranger on the street — with the awareness it may not stay private.

Does this issue only apply to criminal cases, or does it affect civil cases too?

Both. While a notable case on this issue came out of federal criminal court in New York, the concern extends to civil litigation as well. In civil cases, discovery rules allow the opposing party to request relevant communications. The lack of privilege protection around AI chatbot conversations means the risk applies in any legal proceeding, not just criminal matters.

What should I do instead of using AI when I have a legal issue?

Call a licensed attorney. When you hire an attorney, your conversations are protected by attorney-client confidentiality — a legal protection that does not exist when you talk to an AI chatbot. An attorney also brings professional judgment and experience in the specific area of law that applies to your situation, something no AI can replicate.

What is the main risk of sharing details about my legal situation with an AI chatbot?

The main risk is that those communications are not privileged and could be subpoenaed or obtained through discovery. Unlike conversations with your attorney, there is no legal shield protecting what you tell an AI chatbot from being disclosed to opposing counsel or the government. As one real case has already shown, even damaging admissions shared with an AI can end up as courtroom evidence.

Does it matter which AI platform I use — is one safer than another?

From a legal privilege standpoint, no. Whether you’re using ChatGPT, Claude, Gemini, or any other AI chatbot, none of those conversations are protected by attorney-client privilege. The platform doesn’t change the underlying legal reality: you are sharing information with a third-party service, not a licensed attorney bound by confidentiality.

What if I only asked the AI general questions and didn't share anything specific about my case?

General research questions carry much lower risk than sharing case-specific details. The concern becomes significant when you start describing your actual situation — facts, timelines, your role in events, what you did or didn’t do. That’s the kind of information that has value to an opposing party and that you would normally only share with your attorney.

Should I tell my attorney about what I’ve already shared with an AI chatbot?

Your attorney can absolutely talk through what you’ve shared with an AI and help you understand any potential exposure. Being upfront with your attorney about prior AI conversations is important precisely because attorney-client privilege protects those discussions with your lawyer — and your attorney needs the full picture to advise you properly.

Is this just a concern for people who are already in trouble legally, or should everyone be careful?

Everyone who thinks there’s even a chance they could end up in a legal dispute should be thoughtful about this. Legal situations can develop quickly and unexpectedly — a contract disagreement, a workplace issue, a property dispute. If there’s any possibility you could be involved in litigation down the road, the conversations you’re having with AI chatbots today could become relevant. It’s worth being cautious before a problem develops, not just after.

This article is provided for informational purposes only and does not constitute legal advice. Laws and legal interpretations vary by jurisdiction. For advice specific to your situation, consult a licensed attorney in your area.
