OpenAI updates terms to forbid usage for medical and legal advice
17 comments · October 31, 2025 · geor9e
unyttigfjelltol
Doesn’t prohibit brainstorming what to ask your doctor, or which professional consultation to prioritize.
Does prohibit, for illustration, an LLM-powered surgical device.
Everything else is “gray area”?
geor9e
Prohibits in the "I'm a sign, not a cop" sense.
There is no way for them to even remotely verify if you are "without appropriate involvement by a licensed professional" in the room, so to a rebellious outlaw, these prohibitions might as well not exist.
SoftTalker
> brainstorming what to ask your doctor
Generally a bad idea. If you want to be a doctor, go to medical school.
ralph84
I don’t want to be a doctor, I just want to fix what ails me. You don’t need an MD to research symptoms.
OutOfHere
The bad idea is to live and die in ignorance. The good idea is to use GPT to find ideas and references that one can then verify. If it were up to the medical establishment, they would block the public from accessing medical research altogether, and they already do this by paywalling much research.
CGamesPlay
Did the headline get changed? It 100% matches what you're calling out: "OpenAI updates terms to forbid usage..."
OutOfHere
OpenAI already blocked public access to custom GPTs that gave medical advice. Several of my custom GPTs had their previously functional shared access blocked.
dangus
While you are correct, the question now becomes whether the disclaimer can ever be removed.
If the AI isn’t smart enough to replace a licensed expert even given unlimited access to everything a doctor would learn in medical school, where is the value in the AI?
dragonwriter
Plenty of other automation supports licensed experts without replacing them and has value, so even if AI supports licensed experts but can never replace them, it could still have value in that application.
piskov
Unless the following excludes (which it shouldn’t) personal use as opposed to batch use:
Empower people. People should be able to make decisions about their lives and their communities. So we don’t allow our services to be used to manipulate or deceive people, to interfere with their exercise of human rights, to exploit people’s vulnerabilities, or to interfere with their ability to get an education or access critical services, including any use for:
…
automation of high-stakes decisions in sensitive areas without human review:
- critical infrastructure
- education
- housing
- employment
- financial activities and credit
- insurance
- legal ===
- medical ===
- essential government services
- product safety components
- national security
- migration
- law enforcement
MrCoffee7
You can still ask questions for medical advice. You just need to phrase the question more like a hypothetical one instead of making it obvious that you are asking for yourself.
SilverElfin
Wouldn’t this affect many prominent startups? Why wouldn’t they move to a competitor? Is OpenAI assuming it will just be for consumers?
piskov
What stops others from doing the same (if they haven’t already)?
It’s a safe bet: we don’t allow you to ask for medical advice, so we’re not liable if you do anyway and drink mercury or what have you based on our advice.
_wire_
Haha! What a joke
"You can't believe how smart and capable this thing is, ready to take over and run the world"
(Not suitable for any particular purpose - Use at your own risk - See warnings - User is responsible for safe operation...)
(Pan from home robot clumsily depositing clean dishes into an empty dishwasher to a man in VR goggles in next room making all the motions of placing objects in a box)
Check all services you wish to subscribe to ($1000 per service per month):
- Put laundry in washing machine
- Microwave mac & cheese dinner
- Change and feed baby
- Get granny to toilet
- Fix Windows software update error on PC
- Reboot wifi router to restore internet connection
SoftTalker
Standard cop-out that software companies always try to include. They disclaim any warranty of merchantability and fitness for a particular purpose. So if you claim that the software doesn't do what it's supposed to do, they take no responsibility for that.
HN Headline is categorically false.
"you cannot use our services for: provision of tailored advice that requires a license, such as legal or medical advice, without appropriate involvement by a licensed professional"
So, they didn't add any handrails, filters, or blocks to the software. This is just boilerplate "consult your doctor too!" to cover their ass.