
Judge denies creating “mass surveillance program” harming all ChatGPT users

strictnein

Maybe it's the quotes selected for the article, but it seems like the judge simply doesn't get the objections. And the reasoning is really strange:

"Even if the Court were to entertain such questions, they would only work to unduly delay the resolution of the legal questions actually at issue."

So because the lawsuit pertains to copyright, we can ignore possible constitutional issues because it'll make things take longer?

Also, rejecting something out of hand simply because a lawyer didn't draft it seems really antithetical to what a judge should be doing. There is no requirement for a lawyer to be utilized.

judofyr

> … but it seems like the judge simply doesn't get the objections. And the reasoning is really strange

The full order is linked in the article: https://cdn.arstechnica.net/wp-content/uploads/2025/06/NYT-v.... If you read it, things become clearer: the person who complained here filed a specific "motion to intervene," which has a strict set of requirements. These requirements were not met. IANAL, but it doesn't seem too strange to me here.

> Also, rejecting something out of hand simply because a lawyer didn't draft it seems really antithetical to what a judge should be doing. There is no requirement for a lawyer to be utilized.

This is also mentioned in the order: An individual has the right to represent themselves, but a corporation does not. This was filed by a corporation initially. The judge did exactly what a judge is supposed to do: interpret the law as written.

yieldcrv

So the arguments are sound but the procedure wasn't followed, so someone else just needs to follow the procedure and get our chats deletable?

otterley

The judge went further to say the arguments weren't sound, either.

Alive-in-2025

That appears to be the case to me too. There are so many similar services that people use that could fall victim to this same privacy issue. We should have extremely strong privacy laws preventing this, somehow blocking over-broad court orders. And we don't. Imagine these cases:

1. a court order that a dating service saves off all chats and messages between people connecting on the service (instead of just saving off say the chats from a suspected abuser)

2. saving all text messages going through a cell phone company

3. how about saving all google docs? Probably billions of these a day are being created.

4. And how has the govt not tried to put out a legal request that signal add backdoors and save all text messages (because there will no doubt be nefarious users like our own secretary of defense). I think it would take a very significant reason to succeed against a private organization like signal.

The power and reach of this makes me wonder if the US govt has already been doing this to normal commercial services (outside of phone calls and texting). I recall reading back in the day that they were "tapping" phone company trunks / legally accessing them under some security laws. And then we learned about the tapping of Google communications from Edward Snowden.

salawat

>2. saving all text messages going through a cell phone company

Point of order, phone companies already do that. Third Party Doctrine. I don't believe they should, but as of right now, that's where we're at.

chrisweekly

"saving off"?

this is a strange turn of phrase

delusional

> We should have extremely strong privacy laws preventing this, somehow blocking over-broad court orders

Quick question. Should your perceived "right to privacy" supersede all other laws?

To extrapolate into the real world. Should it be impossible for the police to measure the speed of your vehicle to protect your privacy? Should cameras in stores be unable to record you stealing for fear of violating your privacy?

andyferris

I think there's an idea akin to Europe's "right to be forgotten" here.

We can all observe the world in the moment. Police can obtain warrants to wiretap (or the digital equivalent) suspects in real-time. That's fine!

The objection is that we are ending up with laws and rulings that require a recording of history of everyone by everyone - just so the police can have the convenience of trawling through data everyone reasonably felt was private and shouldn't exist except transiently? Not to mention that perhaps the state should pay for all this commercially unnecessary storage? Our digital laws are so out-of-touch with the police powers voters actually consented to - that mail (physical post) and phone calls shall not be intercepted except under probable cause (of a particular suspect performing a specific crime) and a judge's consent. Just carry that concept forward.

On a technical level, I feel a "perfect forward secrecy" technique should be sufficient for implementers. A warrant should have a causal (and pinpoint) effect on what is collected by providers. Of course we can also subpoena information that everyone reasonably expected was recorded (i.e. not transient and private). This matches the "physical reality" of yesteryear - the police can't execute a warrant for an unrecorded person-to-person conversation that happened two weeks ago; you need to kindly ask one of the conversants (who have their own rights to privacy / silence, are forgetful, and can always "plead the 5th").
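To make the forward-secrecy point concrete: the core idea is that each conversation uses a fresh ephemeral key pair that is deleted afterward, so a captured transcript can't be decrypted later even under compulsion. Here's a minimal sketch of that idea using a toy Diffie-Hellman exchange - the parameters and function names are illustrative only (real systems use vetted constructions like X25519, not a small prime):

```python
import hashlib
import random

# Toy parameters for illustration only -- real PFS uses vetted groups
# (e.g. X25519), never a demo prime like this.
P = 2**61 - 1   # a Mersenne prime: fine for a demo, useless for security
G = 3

def ephemeral_keypair():
    """Fresh key pair per session; the private half is never persisted."""
    priv = random.randrange(2, P - 1)
    return priv, pow(G, priv, P)

def session_key(my_priv, their_pub):
    """Derive a symmetric session key from the DH shared secret."""
    shared = pow(their_pub, my_priv, P)
    return hashlib.sha256(shared.to_bytes(8, "big")).hexdigest()

# One "session": each side makes an ephemeral key and exchanges publics...
a_priv, a_pub = ephemeral_keypair()
b_priv, b_pub = ephemeral_keypair()
assert session_key(a_priv, b_pub) == session_key(b_priv, a_pub)

# ...then deletes its private key. A later subpoena over a captured
# transcript recovers neither the session key nor the plaintext.
del a_priv, b_priv
```

The legal parallel: a warrant served *before* a session can compel collection going forward, but nothing forces the past back into existence - matching the unrecorded in-person conversation of yesteryear.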

gridspy

Your two examples don't map to the concern about data privacy.

Speed cameras only operate on public roads. The camera in the store is operated by the store owner. In both cases one of the parties involved in the transaction (driving, purchasing) is involved in enforcement. It is clear in both cases that these measures protect everyone and they have clear limits also.

Better examples would be police searching your home all the time, whenever they want (This maps to device encryption).

Or store owners surveilling competing stores / forcing people to wear cameras 24/7 "to improve the customer experience" (This maps to what Facebook / Google try to do, or what internet wire tapping does).

freejazz

>So because the lawsuit pertains to copyright, we can ignore possible constitutional issues because it'll make things take longer?

What constitutional issues do you believe are present?

> There is no requirement for a lawyer to be utilized.

Corporations must be represented by an attorney, by law. So that's not true outright. Second, if someone did file something pro se, they might get a little leeway. But the business isn't represented pro se, so why on earth would the judge apply a lower standard appropriate for a pro se party to a sophisticated law firm, easily one of the largest and best in the country?

When you are struggling to reason around really straightforward issues like that, it does not leave me with confidence about your other judgments regarding the issues present here.

strictnein

> When you are struggling to reason around really straightforward issues like that, it does not leave me with confidence about your other judgments regarding the issues present here.

Or, perhaps, that's not something known by most. I didn't struggle to understand that, I simply didn't know it. Also, again, the article could have mentioned that, and I started my statement by saying maybe the article was doing a bad job conveying things.

> What constitutional issues do you believe are present?

This method of interrogation of online comments is always interesting to me. Because you seem to want to move the discussion to that of whether or not the issues are valid, which wasn't what I clearly was discussing. When you are struggling to reason around really straightforward issues like that, it does not leave me with confidence about your other judgments regarding the issues present here.

Aeolun

Now that you’ve both done it, can we stop with the ad hominem?

paulddraper

> What constitutional issues do you believe are present?

4th Amendment (Search and Seizure)

kopecs

Do you think the 4th amendment enjoins courts from requiring the preservation of records as part of discovery? The court is just requiring OpenAI to maintain records it already maintains and segregate them. Even if one thinks that _is_ a government seizure, which it isn't---See Burdeau v. McDowell, 256 U.S. 465 (1921); cf. Walter v. United States, 447 U.S. 649, 656 (1980) (discussing the "state agency" requirement)---no search or seizure has even occurred. There's no reasonable expectation of privacy in the records you're sending to OpenAI (you know OpenAI has them!!; See, e.g., Smith v. Maryland, 442 U.S. 735 (1979)) and you don't have any possessory interest in the records. See, e.g., United States v. Jacobsen, 466 U.S. 109 (1984).

tptacek

This doesn't seem especially newsworthy. Oral arguments are set for OpenAI itself to oppose the preservation order that has everyone so (understandably) up in arms. Seems unlikely that two motions from random ChatGPT users were going to determine the outcome in advance of that.

gridspy

Seems that a judge does not understand the impact of asking company X to "retain all data" and is unwilling to rapidly reconsider. Part of what makes this newsworthy is the impact of the initial ruling.

aydyn

The judge is clearly not caring about this issue so arguing before her seems pointless. What is the recourse for OpenAI and users?

Analemma_

You don't have any recourse, at least not under American law. This is a textbook third-party doctrine case: American law and precedent are unambiguous that once you voluntarily give your data to a third party-- e.g. when you sent it to OpenAI-- it's not yours anymore and you have no reasonable expectation of privacy about it. Probably people are going to respond to this with a bunch of exceptions, but those exceptions all have to be enumerated and granted specifically with new laws; they don't exist by default, and don't exist for OpenAI.

Like it or not, the judge's ruling sits comfortably within the framework of US law as it exists at present: since there's no reasonable expectation of privacy for chat logs sent to OpenAI, there's nothing to weigh against the competing interest of the active NYT case.

like_any_other

> once you voluntarily give your data to a third party-- e.g. when you sent it to OpenAI-- it's not yours anymore and you have no reasonable expectation of privacy about it.

The 3rd party doctrine is worse than that - the data you gave is not only not yours anymore, it is not theirs either, but the government's. They're forced to act as a government informant, without any warrant requirements. They can say "we will do our very best to keep your data confidential", and contractually bind themselves to do so, but hilariously, in the Supreme Court's wise and knowledgeable legal view, this does not create an "expectation of privacy", despite whatever vaults and encryption and careful employee vetting and armed guards standing between your data and unauthorized parties.

kopecs

I don't think it is accurate to say that the data becomes the government's or they have to act as an informant (I think that implies a bit more of an active requirement than responding to a subpoena), but I agree with the gist.

AnthonyMouse

> You don't have any recourse, at least not under American law.

Implying that the recourse is to change the law.

Those precedents are also fairly insane and not even consistent with one another. For example, the government needs a warrant to read your mail in the possession of the Post Office -- not only a third party but actually part of the government -- but not the digital equivalent of this when you transfer some of your documents via Google or Microsoft?

This case is also not the traditional third party doctrine case. Typically you would have e.g. your private project files on Github or something which Github is retaining for reasons independent of any court order and then the court orders them to provide them to the court. In this case the judge is ordering them to retain third party data they wouldn't have otherwise kept. It's not clear what the limiting principle there would be -- could they order Microsoft to retain any of the data on everyone's PC that isn't in the cloud, because their system updater gives them arbitrary code execution on every Windows machine? Could they order your home landlord to make copies of the files in your apartment without a warrant because they have a key to the door?

comex

The third-party doctrine has been weakened by the Supreme Court recently, in United States v. Jones and Carpenter v. United States. Those are court decisions, not new laws passed by Congress. See also this quote:

https://en.wikipedia.org/wiki/Third-party_doctrine#:~:text=w...

If OpenAI doesn't succeed at oral argument, then in theory they could try for an appeal either under the collateral order doctrine or seeking a writ of mandamus, but apparently these rarely succeed, especially in discovery disputes.

otterley

Justice Sotomayor's concurrence in U.S. v. Jones is not binding precedent, so I wouldn't characterize it as weakening the third-party doctrine yet.

anon7000

Yep. This is why we need constitutional amendments or more foundational laws around privacy that changes this default. Which should be a bipartisan issue, if money had less influence in politics.

AnthonyMouse

This is the perverse incentives one rather than the money one. The judges want to order people to do things and the judges are the ones who decide if the judges ordering people to do things is constitutional.

To prevent that you need Congress to tell them no, but that creates a sort of priority inversion: The machinery designed to stop the government from doing something bad unless there is consensus is then enabling government overreach unless there is consensus to stop it. It's kind of a design flaw. You want checks and balances to stop the government from doing bad things, not enable them.

impossiblefork

OpenAI is the actual counterparty here though and not a third party. Presumably their contracts with their users are still enforceable.

Furthermore, if the third party doctrine is upheld in its most naïve form, then this would breach the EU-US Data Privacy Framework. The US must ensure equivalent privacy protections to those under the GDPR in order for the agreement to be valid. The agreement also explicitly forbids transferring information to third parties without informing those whose information is transferred.

mrweasel

They probably do already, but won't this ruling force OpenAI to operate separate services for the US and EU? US users must accept that their logs are stored indefinitely, while an EU user is entitled to have theirs deleted.

yxhuvud

Well, I don't think anyone is expecting the framework to work this time either, after earlier tries have been invalidated. It is just panicked politicians trying to kick the can to avoid the fallout that happens when it can't be kicked anymore.

dylan604

appealing whatever ruling this judge makes?

otterley

You can't appeal a case you're not a party to.

baobun

It's a direct answer to the question what recourse OpenAI has.

Users should stop sending information that shouldn't be public to US cloud giants like OpenAI.

freejazz

Stop giving your information to third parties with the expectation that they keep it private when they won't and cannot. Your banking information is also subject to subpoena... I don't see anyone here complaining about that. Just the hot legal issue of the day that programmers are intent on misunderstanding.

FpUser

>"What is the recourse for OpenAI and users?"

Start using services of countries who are unlikely to submit data to the US.

mvdtnz

"OpenAI user" is not an inherent trait. Just use another product, make it OpenAI's problem.

lifeisstillgood

We (various human societies) do need to deal with this new ability to surveil every aspect of our lives. There are clear and obvious benefits in the future - medicine and epidemiology will have enormous reservoirs of data to draw on, entire new fields of mass behavioural psychology will come into being (I call it Massive Open Online Psychology, or MOOP), and we might even find governments able to use minute-by-minute data of their citizens to, you know, provide services to the ones they miss…

But all of this assumes a legal framework we can trust - and I don’t think this comes into being piecemeal with judges.

My personal take is that data that would not exist, or would be different, without the existence or activity of a natural human must belong to that human - and that it can only be held in trust without explicit payment to that human if the data is used in the best interests of the human (something something criminal notwithstanding).

Blathering on a bit I know, but I think "in the best interests of the user / citizen" is a really high and valuable bar, and also that saying "by default, if my activities create or enable the data, it belongs to me" really forces data companies to think.

Be interested in some thoughts

meowkit

Zero knowledge proofs + blockchain stream payments + IPFS or similar based storage with encryption and incentive mechanisms.

It's still outside the Overton window (especially on HN), but the only way that I've seen where we can get the benefits of big data and maintain privacy is by locking the data to the user and not aggregating it in all these centralized silos that are then incentivized to build black markets around that data.

warkdarrior

How do you apply ZKPs to ChatGPT queries?

NoImmatureAdHom

As far as cryptographic solutions go: what would be ideal is homomorphic encryption, where the server can do the calculations on data it can't decrypt (your query) and send you something back that only you can decrypt. Assuming that's unworkable, we could still have anonymity via cryptocurrency payments for tokens (don't do accounts) + IPFS or Tor or similar. You can carry around your query + answer history with you.
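For a sense of what "calculating on data it can't decrypt" means, here is a toy sketch of the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. The primes and helper names (`add_encrypted`, etc.) are mine for illustration; demo-sized parameters like these are of course not secure, and full LLM inference would need far more powerful (and currently impractical) fully homomorphic schemes:

```python
import math
import random

def keygen(p: int, q: int):
    """Toy Paillier key generation. Demo-sized primes only -- NOT secure."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)          # Carmichael function of n
    # With the standard choice g = n + 1, mu simplifies to lam^-1 mod n.
    mu = pow(lam, -1, n)
    return n, (lam, mu)

def encrypt(n: int, m: int) -> int:
    """Enc(m) = (n+1)^m * r^n mod n^2 for a random r coprime to n."""
    n2 = n * n
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(n: int, sk, c: int) -> int:
    lam, mu = sk
    n2 = n * n
    L = (pow(c, lam, n2) - 1) // n        # L(x) = (x - 1) / n
    return (L * mu) % n

def add_encrypted(n: int, c1: int, c2: int) -> int:
    """Multiplying ciphertexts adds the plaintexts underneath."""
    return (c1 * c2) % (n * n)

# The server can sum values it cannot read:
n, sk = keygen(61, 53)                    # toy primes
total = add_encrypted(n, encrypt(n, 12), encrypt(n, 30))
assert decrypt(n, sk, total) == 42        # 12 + 30, computed blind
```

Addition alone obviously doesn't run a transformer, which is why the comment above hedges with "assuming that's unworkable" - but it shows the shape of the privacy guarantee.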

unyttigfjelltol

The "judge" here is actually a magistrate whose term expires in less than a year.[1]

Last time I saw such weak decision-making from a magistrate I was pleased to see they were not renewed, and I hope the same for this individual.

[1] https://nysd.uscourts.gov/sites/default/files/2025-06/Public...

cvoss

Not sure why scare quotes are warranted here. Your so-called '"judge"' is, in fact, a judge.

https://www.uscourts.gov/about-federal-courts/types-federal-...

unyttigfjelltol

Magistrate judges are more variable, not subject to Senate confirmation, do not serve for life, render decisions that very often are different in character from those of regular judges-- focusing more on "trees" than "forest". Without consent, their scope of duties is limited and they cannot simply swap in for a district judge. They actually are supervised by a district judge, and appeals, as it were, are directed to that officer not an appellate court.

In a nutshell, I used quotes to indicate how the position was described by the article. These judicial officers are not interchangeable with judges in the federal system, and in my experience this distinction is relevant both to why this person issued the kind of decision they did, and to what it means for confidence in the US justice system.

freejazz

I don't think a routine application of 3rd party doctrine should sink a magistrate judge.

unyttigfjelltol

So in your view, there is no expectation of privacy in anything typed into the Internet. And, if a news organization, or a blogger, or whoever, came up with some colorable argument to discover everything everyone types into the Internet, and sued the right Internet hub-- you think this is totally routine and no one should be annoyed in the least, and moreover no one should be allowed to intervene or move for protective order, because it would be more convenient for the court to just hand over all the Internet to the news or blogger or whoever.

It's precisely that perspective that I think should sink a magistrate, hundreds of times over.

notnullorvoid

Even if OpenAI and other LLM providers were prohibited by law not to retain the data (opposite of this forced retention), no one should trust them to do so.

If you want to input sensitive data into an LLM, do so locally.

sebastiennight

> prohibited by law not to retain the data

is the same as forced retention. You've got a double negative here that I think you didn't intend.

merksittich

Previous discussion:

OpenAI slams court order to save all ChatGPT logs, including deleted chats

https://news.ycombinator.com/item?id=44185913

Akranazon

> Judge denies creating “mass surveillance program” harming all ChatGPT users

What a horribly worded title.

A judge rejected the creation of a mass surveillance program?

A judge denied that creating a mass surveillance program harms all ChatGPT users?

A judge denied that she created a mass surveillance program, and its creation (in the opinion of the columnist) harms all ChatGPT users?

The judge's act of denying resulted in the creation of a mass surveillance program?

The fact that a judge denied what she did harms all ChatGPT users?

(After reading the article, it's apparently the third one.)

elefanten

The third one is the only correct way to interpret the title.

delusional

It's crazy how much I hate every single top level take in this thread.

Real human beings' actual work is allegedly being abused to commit fraud at a massive scale, robbing those artists of the ability to sustain themselves. Your false perception of intimacy while asking the computer Oracle to write you smut does not trump the fair and just discovery process.

devmor

I find it really strange how many people are outraged or shocked about this.

I have to assume that they are all simply ignorant of the fact that this exact same preservation of your data happens constantly in every other service you use, other than those that are completely E2EE, like Signal chats.

Gmail is preserving your emails and documents. Your cell provider is preserving your texts and call histories. Reddit is preserving your posts and DMs. Xitter is preserving your posts and DMs.

This is not to make a judgement about whether or not this should be considered acceptable, but it is the de facto state of online services.


akimbostrawman

fighting Microsoft to get the illusion of privacy is the modern-day equivalent of tilting at windmills