Can ChatGPT Still Give You Legal Advice? Here’s What’s Changed
Recent headlines claimed that ChatGPT would “no longer give legal or medical advice” following what appeared to be an October 29 policy update. Some outlets presented it as a full ban, while others described it as a shift toward calling ChatGPT an “educational tool.”
The reality is more nuanced. OpenAI’s usage policies now clarify that the system should not provide tailored advice requiring a licensed professional. That doesn’t mean ChatGPT has been stripped of usefulness.
Here’s what the recent changes mean for you if you’re considering using ChatGPT as your personal injury lawyer.
What Can ChatGPT Still Do? What Can’t It Do?
ChatGPT can:
- Explain legal concepts in plain terms
- Break down legal terminology
- Outline standard procedures in many practice areas
ChatGPT cannot:
- Draft a personalized legal strategy for your case
- Provide specific legal recommendations unique to you
- Replace the judgment, accountability, or licensing of a real attorney
This distinction matters because the line between “general info” and “personalized legal advice” is exactly where unauthorized practice and malpractice liability live.
“I spent nearly five hours in a deposition the other day where the defense was being evasive and trying to play games,” says Parham Nikfarjam, Senior Trial Attorney at J&Y Law. “That’s where experience and strategy come in. Those are things AI can’t replicate. A lawyer doesn’t just ask questions. They read the room, change tone, apply pressure, and pivot when needed. ChatGPT can’t do that.”
For a free legal consultation, call (424) 453-2310
Are ChatGPT Conversations Confidential or Privileged?
No, they are not. Communications with ChatGPT are not protected by attorney-client privilege. In fact, in the New York Times’ copyright litigation against OpenAI, a court issued a broad preservation order covering user chat logs, showing how AI conversations can be swept into litigation.
Never input sensitive or identifying information about your case into a public AI tool.
How Many People Are Using ChatGPT Thinking It’s Legal Advice?
While there is no official figure for how many people treat ChatGPT as their lawyer, we have some strong indicators:
- A recent Pew Research Center survey found that about 34% of U.S. adults have used ChatGPT, with the share rising to roughly 60% among adults under 30.
- Many of those users report turning to it for advice or information, not just casual conversation.
- Courts have recorded sanctions where litigants or lawyers relied heavily on AI-generated filings and fake case citations.
Usage is widespread, but treating ChatGPT as a substitute for a licensed attorney remains risky.

Click to contact our personal injury lawyers today
Why Does the Distinction Between “General Information” and “Tailored Advice” Matter?
OpenAI’s policy serves three key compliance functions:
- Prevents tools from acting like unlicensed attorneys (avoiding unauthorized practice).
- Reminds users that you are still responsible for your legal decisions, and AI cannot bear that load.
- Shapes how legal and regulatory systems will treat AI tool logs, disclosure, and accountability.
“People ask me all the time, ‘Can’t AI just tell me what to do?’ But the truth is, every case has layers,” says Nikfarjam. “I’ve handled cases where a tow truck driver caused a crash, and only by digging into their background did we uncover a string of prior criminal convictions. That changed the whole strategy. ChatGPT would never know to look for that.”
Complete a Free Case Evaluation form now
Are the ChatGPT Legal Changes Positive?
Some critics argue the policy change is more about compliance optics than substantive reform. They note:
- AI tools may still propagate errors or unreliable output (“hallucinations”).
- Users may assume AI is private or accurate when it isn’t.
- The regulatory framework around AI in law is still emerging, and enforcement is inconsistent.
The guardrails are a step forward, but their real impact depends on how firms and regulators enforce them.
What Should Courts & Regulators Do?
To keep pace with AI’s rise, courts and regulators should:
- Require certification or verification when AI-assisted filings are used
- Ensure explicit rules for disclosure of AI usage in legal processes
- Clarify professional boundaries between human-only work and AI-assisted work
What Should Law Firms & Legal Aid Providers Do?
If you’re a law firm, adopt written AI-use policies that cover:
- Approved use cases and prohibited uses
- Restrictions on entering client information into public tools
- Steps for verification and supervision
- Client disclosures about AI usage
Training for AI literacy is key. We need to treat the AI’s work like a junior intern’s draft, always subject to review and oversight.
What Should You Know if You’re Representing Yourself?
- Use AI only for background education, not strategy or decisions
- Never rely on AI to calculate deadlines, rights, or core legal strategy
- Always verify any citation you use in your case
- Assume your chats may become discoverable evidence
- Never enter your personal case facts or confidential details into a public chatbot
So Can ChatGPT Still Help With Legal Questions?
Yes, but with the right expectations. Think of ChatGPT as a library with a chat interface, not a substitute for licensed legal counsel, especially when it comes to sensitive topics like personal injury or elder abuse law.
It’s useful for:
- Research summaries
- Legal definitions
- High-level procedural explanations
But it’s not for:
- Personalized legal guidance
- Claim drafting or strategy building
- Replacing a lawyer who will answer for your rights
“There’s an art to lawyering that will always be human,” says Nikfarjam. “We’ve been to accident scenes, interviewed witnesses, spotted a cracked sidewalk or a broken latch that turned an average case into a premises liability claim. We’ve been there for grieving families. This work is personal for us. AI will never be able to match that.”
If you’re tired of trying to represent yourself with the help of ChatGPT, give us a call at (877) 735-7035. We don’t get paid unless we win your case!
Call or text (424) 453-2310 or complete a Free Case Evaluation form