Ninth Circuit Takes a Wrecking Ball to Internet Personal Jurisdiction Law
49 comments
April 23, 2025 · dpifke
otterley
To be clear, this isn't a choice-of-law case. It's not about whether California law applies. It's about whether a court in California has jurisdiction; that is, whether it can hear the case at all.
Quarrel
It also isn't against the Californian company. The suit is against Shopify, which is not a Californian company.
Seems bizarre to me that several lower courts ruled in favor of Shopify, though...
bluGill
Shopify, on behalf of a California company.
dpifke
Could a tort claim under California state law be heard in any other court (assuming no accompanying Federal claims)?
otterley
Yes. Choice-of-law terms are frequently found in contracts. When interpreting the law, the court with jurisdiction will do its best to refer to and apply the existing law of the state in question to the case.
jMyles
> And if Shopify wants to take a cut of every sale from retailers based in California, they should be willing to comply with California law as well,
For the moment, for the purpose of consumer protection, fine. But on longer time scales, I'm not sure. Does it really make sense for legacy states to be able to bind transactions on the internet? Doesn't that just make it a very large intranet?
Obviously information refuses to be stopped by borders. Are we going to have a situation where states of various sizes try to claim jurisdiction, but only those with sufficiently violent tendencies (and thus the ability to achieve compliance by fear) succeed? Won't that corrupt the internet even worse than an absence of state-based consumer protections?
If two people who live 500 miles apart in the area currently claimed by the State of California, but who do not recognize the State of California and regard themselves as citizens of the internet, transact with each other online, who is right: them, or the government of California?
Most of us will probably say that there is some social contract by which, for better or worse, the State of California is right.
But what if, in 100 years, California goes bankrupt? Does that change the calculus? If so, why? And does it change retroactively, for the purposes of historical classification of internet transactions? The diplomatic and economic affairs of state don't change the operation of internet protocols. It's hard to even fully imagine how to create an internet whose shapes are coterminous with the boundaries asserted by various states.
I'm broadly skeptical of any judicial rulings which extend the laws of the legacy states onto the internet, even if they appear to be on the side of short-term justice. This whole thing is starting to feel like a bandaid better ripped off quickly.
josephg
> Does it really make sense for legacy states to be able to bind transacations on the internet?
Yes.
We have a problem right now where the only place democracy, sensible laws and due process take place is in meatspace.
The internet - insomuch as it’s a real place - is a feudal society. It’s made up of small fiefdoms (websites) and some larger kingdoms which exert tyrannical power within their borders. They watch everything you do - usually to advertise to you. And they can banish you at a moment’s notice if doing so would result in more profit for their rulers.
There’s an interesting argument you can make that the internet should be its own sovereign space. “Information wants to be free” and all that. Maybe if the internet had been created 200 years ago, during the period of time when constitutions were being written everywhere, we would have created one for the internet. And then, maybe, the internet could have policies and courts and rules that uphold the rights of people. But that hasn’t happened. We have, through our collective inaction, delegated judicial oversight of the internet to sovereign states in meatspace. And thank goodness. Somebody needs to tell internet companies that my personal data is not for sale. Or tell Apple that they aren’t entitled to 15-30% of Netflix’s revenue after already selling a user their phone. (And they’d better not dare redirect users to their own website!)
If we technologists won’t govern ourselves, we delegate that important job to the state of California. To the European Union. To Australia’s department of fair trade & ACCC. And so on. It means we get a fractured Internet. But people have inalienable rights that need to be defended. Those rights must not be undermined just because we’re online and there’s a profit to be made.
nullc
Laws are also the most effective tool for destroying rights, arguably much more so than protecting them.
So the flip side of your position - that someone needs to be subject to a foreign law when dealing with a foreign party, because otherwise that party's rights might be stomped - is that they also need the means to block interactions with that foreign party so their own rights aren't potentially stomped.
In the case where there are sales you might actually know where the other parties reside, but in the majority of interactions online you don't and there is no great means to control your exposure to other jurisdictions.
ang_cire
This ruling is correct, and good.
> If a company develops a web-based business for the purpose of conducting online transactions in all 50 States, it should not be surprised that it may be sued in any State for unlawful transactions that may occur within that State.
Obviously. But the author calls this "chilling". Without this, companies could circumvent state laws - conducting actions within a state that are illegal there - simply by headquartering or hosting in another. That would be absurd.
It would create a race to the bottom for consumer rights, where states wanting business tax revenue are incentivized to make themselves surveillance / data-harvesting / consumer-exploitation havens, whose businesses could then operate freely across all other states.
mmooss
Think of the mom & pop handmade craft business selling ceramics online. They couldn't afford to sell outside their state.
cthalupa
The ruling explicitly addresses this by talking about the scale and scope of Shopify's business in California. Read the 'express aiming' portions.
Trying to conflate a mom & pop shop with Shopify is silly and not something that the court attempted to do.
3np
> What Could Shopify Have Done Differently?
For completeness, I think "cease to insecurely extract, aggregate and abuse all that user data" should also be mentioned as an alternative to the different ways they could skirt regulation.
clucas
You’re misunderstanding the question - he’s asking how Shopify could avoid jurisdiction, not avoid this suit. Jurisdiction is a threshold question before you get to the merits… maybe Shopify did the bad thing, maybe they didn’t, but before we decide that, we need to determine if California law even applies to Shopify.
The author seems to think that there should be some way for Shopify to avoid jurisdiction while still offering services in California, but I don’t really understand why he thinks so.
otterley
As a former student of the author, I don't think he's saying they should be able to avoid jurisdiction. I think he was musing on whether it would even be possible under this new Ninth Circuit framework/test. He concludes it's unlikely, and hence for Shopify (or any other company putting cookies in browsers) to have any chance of avoiding it, they're going to have to appeal to SCOTUS.
xbar
Not at all. I think he rightly concludes that jurisdiction is completely avoidable by geoblocking California.
It is baffling to hear the author ask the question “did Shopify ‘expressly aim’ an intentional act at California?” and subsequently conclude that Shopify’s entire business model is in doubt if it doesn’t do business in California.
johnea
I was going to quote, and respond in almost the exact same way.
The only change I would make to your suggestion would be to remove the word "insecurely".
They shouldn't extract or aggregate user data in any fashion whatsoever.
getcrunk
Backstory from eff:
https://www.eff.org/deeplinks/2024/07/courts-should-have-jur...
ang_cire
For anyone not clicking through, EFF supports the 9th Circuit in this case.
abtinf
Unfortunately, EFF is not principled. Had the facts of this case been slightly different—say, if state law were to require some privacy-invading practice that the foreign company doesn’t want—then EFF would take the opposite stand.
Glyptodon
Maybe the line of reasoning offered and argued against is dubious. But IMO there are literally dozens of other arguments that will come to the same conclusion if you want to avoid hand waving about the particular bits the author raises.
By and large, states having different laws is a pain, but arguing that you can do business in every state while only following the laws of one state is a very messy rejection of states' rights, and leads to using the commerce clause to basically negate most state-level regulations and jurisdiction.
Alupis
I suspect this has something to do with "Shop Pay", Shopify's own payment system used on most (all?) Shopify stores. It lets you keep saved payment information for any Shopify store you come across, facilitating one-click checkout even if you have never shopped with that particular brand/website before. Webshop operators love it because it is very good at fraud detection (due to the pooled data on the backend) and removes barriers at checkout (needing your wallet, filling out an address form, etc.). As far as I'm aware, it's optional on the Shopify platform. Using Shop Pay for payment is optional on the consumer level.
I suspect Shopify's terms inform their customers (webshop operators) that they are responsible for disclosure, etc., and for being compliant with state privacy laws - however, since the majority of web shops are exempt (due to small size, revenue, etc.), these shops did not (knowingly or otherwise) publish these terms. That's just speculation on my part...
If this is true, I find this case troubling and weak, and hope it is overturned. It is squarely on the shop operator to be compliant - Shopify is just a platform vendor and shoppers are not Shopify customers; rather, they are customers of the shop. This seems to be akin to suing Google because a website uses Google Analytics but didn't disclose it in their privacy statement - silly...
This particular case gives me ADA and Prop65 vibes... lots of bottom-feeding lawyers using serial plaintiffs to extort businesses out of money. At least in this case they're going after someone with deep pockets and not just small businesses...
ndriscoll
I'm not familiar enough with California's law to know whether companies like Shopify/Google are meant to be liable (in the sense that the law says so), but certainly it would be a great thing if the companies actually performing the mass surveillance (Google, Shopify) were liable even if the payload deliverer is small. Absolutely what is needed is law saying that Google can be sued (or better, held criminally liable for harassment/stalking) for spying on people through its Google Analytics program, among others.
Relentlessly stalking millions of people makes it millions of times worse than stalking one person, not somehow okay.
rkagerer
Or hold enough of those small actors to account that nobody wants to do business with Google Analytics in its current form.
It disgusts me that companies who want to transact with me don't vet their partners better. Off-Meta is another one that's despicable. Companies like my bank or their partners have NO business uploading lists of their users to third parties like that (even if it was induced by use of their analytics SDK's).
nozzlegear
> If this is true, I find this case troubling and weak, and hope it is overturned. It is squarely on the shop operator to be compliant - Shopify is just a platform vendor and shoppers are not Shopify customers; rather, they are customers of the shop. This seems to be akin to suing Google because a website uses Google Analytics but didn't disclose it in their privacy statement - silly...
Most of my work is in the Shopify app dev ecosystem, and while I haven't been following this case very closely, I do think it's ironic how Shopify is behaving here given the privacy standards they enforce on their app developers.
Some context: all Shopify app developers are required to follow the EU's GDPR rules for customer data, full stop. Your app must implement Shopify's mandatory GDPR webhooks. You must delete customer data when a shop's customer is deleted; you must produce all data you store on a shop's customer within 7 days upon receipt of a certain GDPR webhook; and you must delete all the data you store on the shop itself after the shop uninstalls your app.
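For anyone curious what that looks like in practice, here's a rough sketch of a handler for those mandatory compliance webhooks (TypeScript/Express; the topic and header names are the ones Shopify documents, but the secret variable, route, and export/delete helpers are placeholders I'm making up for illustration):

    // Hedged sketch of Shopify's mandatory compliance webhooks.
    // The helpers below are hypothetical stand-ins for an app's own storage layer.
    import crypto from "crypto";
    import express from "express";

    const SECRET = process.env.SHOPIFY_API_SECRET ?? ""; // app's API secret (assumed env var)

    const exportCustomerData = (id: unknown) => console.log("export data for customer", id);
    const deleteCustomerData = (id: unknown) => console.log("delete data for customer", id);
    const deleteShopData = (id: unknown) => console.log("delete data for shop", id);

    // Shopify signs each webhook body with HMAC-SHA256 of the raw bytes,
    // keyed by the app's API secret and sent base64-encoded in a header.
    function verifyHmac(rawBody: Buffer, hmacHeader: string): boolean {
      const digest = crypto.createHmac("sha256", SECRET).update(rawBody).digest("base64");
      const a = Buffer.from(digest);
      const b = Buffer.from(hmacHeader);
      return a.length === b.length && crypto.timingSafeEqual(a, b);
    }

    const app = express();
    app.post("/webhooks", express.raw({ type: "application/json" }), (req, res) => {
      if (!verifyHmac(req.body, req.get("X-Shopify-Hmac-Sha256") ?? "")) {
        res.sendStatus(401); // reject unsigned or tampered payloads
        return;
      }
      const payload = JSON.parse(req.body.toString());
      switch (req.get("X-Shopify-Topic")) {
        case "customers/data_request": // produce everything you store on this customer
          exportCustomerData(payload.customer?.id);
          break;
        case "customers/redact": // erase everything you store on this customer
          deleteCustomerData(payload.customer?.id);
          break;
        case "shop/redact": // erase shop data after the app is uninstalled
          deleteShopData(payload.shop_id);
          break;
      }
      res.sendStatus(200);
    });
    app.listen(3000);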
Additionally, if your app requires access to any customer data (whether it's via the Customer API, or via other APIs, e.g. to get the name of a customer who placed an order), you need to apply for access to that data on an app-by-app basis – replete with an explanation for why your app needs the data. Shopify's app store staff has to manually review and approve that data access application before you can publish your app on their app store.
To be clear, I think these restrictions are a good thing†, as apps used to have access to a veritable firehose of private customer data. But it's ironic to see Shopify enforce such standards on their app developers, while at the same time arguing that they should be able to track their own potential customers anywhere and everywhere across the internet regardless of privacy laws.
† Though I think it's a little odd that a Canadian company is making me, an American app developer, think about/adhere to the EU's GDPR rules. Not to mention other privacy laws like the one in California. Why not just call it "Shopify's Privacy Standards?"
decimalenough
Shopify is not enforcing those rules out of the goodness of its heart. It is in Shopify's best interest that retailers have as little information about their customers as possible and that it's as difficult as possible to export the data they do have out of Shopify, because that ties retailers to the Shopify ecosystem.
chocolatkey
Stripe also has a version of this called “Link”, which uses SMS authentication. Based on Stripe data on multiple platforms I have access to, quite a high percentage of people use it, probably due to how hard it’s pushed by the UI when adding a payment method
pessimizer
> It is squarely on the shop operator to be compliant - Shopify is just a platform vendor and shoppers are not Shopify customers; rather, they are customers of the shop.
I disagree energetically. If Shopify wants to run a service identifying people across every site that it serves as a backend to, it should ask those people if they want to be included in that. The only other way to stop the illegal activity would be to print a list of Shopify's customers and visit (and sue) them one by one in California. Shopify is running the service, and the shop owner probably doesn't even know how it works.
I'd even think that a shop owner sued over this should in turn be able to sue Shopify. If Shopify knows that something it does is not legal in California, it should tell its clients who may do business in California.
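For the non-lawyers, the mechanism in question is mundane. A rough, hypothetical sketch (TypeScript/Express; the domain, cookie name, and endpoint are invented for illustration, not Shopify's actual implementation): any backend whose resources are embedded by many unrelated storefronts can recognize the same visitor across all of them with a single cookie.

    // Hypothetical shared-backend tracking sketch: many unrelated storefronts
    // embed a script from one shared domain ("tracker.example" here, made up),
    // so a single cookie set by that domain identifies the visitor everywhere.
    import crypto from "crypto";
    import express from "express";

    const app = express();

    // Served from the shared domain and embedded by every storefront.
    app.get("/pixel.js", (req, res) => {
      // Parse the visitor ID out of the Cookie header, or mint a new one.
      const cookies = Object.fromEntries(
        (req.headers.cookie ?? "").split("; ").filter(Boolean).map((c) => {
          const i = c.indexOf("=");
          return [c.slice(0, i), c.slice(i + 1)];
        })
      );
      const visitorId = cookies["vid"] ?? crypto.randomUUID();

      // Because the cookie belongs to the shared domain, the same visitorId
      // comes back regardless of which storefront embedded this script.
      res.setHeader("Set-Cookie", `vid=${visitorId}; Max-Age=31536000; SameSite=None; Secure`);
      console.log(`visitor ${visitorId} seen on ${req.headers.referer ?? "unknown store"}`);
      res.type("application/javascript").send("/* tracking pixel */");
    });

    app.listen(3000);

The same visitor ID comes back no matter which store embedded the script - that cross-site identification is the behavior at issue, and it happens without the shop owner doing anything.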
Alupis
You opt-into using Shop Pay, as a consumer. By default you are in "guest" mode.
> If Shopify knows that something it does is not legal in California
This is what is being debated. This ruling is mostly expected out of the 9th... we'll see what happens when a real court hears this case.
Aloisius
What are the odds the Supreme Court hears this?
healsdata
> then more privacy-protective options are not feasibly available to Shopify
I haven't laughed that hard in a while. Poor Shopify, they couldn't possibly protect the privacy and data of their customers.
TrueDuality
I was with you until this paragraph:
> The broadest possible interpretation of this ruling is that any website that downloads any digital asset–cookies, javascript, heck maybe even HTML–onto a California resident’s computer can be sued in California, even if the website doesn’t know where the users are. If this is correct, the majority effectively would be saying: if you place a cookie on a reader’s device, you’ve done something more than passive publishing (i.e., you can passively publish without the cookie) and must accept the jurisdictional consequences.
This feels so close, but a little off from my interpretation. In Collins's concurrence, he specifically calls out that the parties required for the transaction's "minimum contacts" were both in the state, as part of the reason jurisdiction is unambiguous, with Shopify being a third, unexpected party to the information. To insert itself into the transaction is, in effect, to insert itself into the jurisdiction as well. This paragraph:
> When a State specifically regulates the conduct of electronic systems with respect to transactions within its borders, the as-intended operation of those systems within that State is the relevant tortious conduct for minimum-contacts purposes, and that conduct is attributable to those persons who deliberately intended that such systems reach into that State and operate in that manner when they do so...
For me, this implies that a third-party service which is neither required nor expected when a user interacts with a site, but which runs scripts or sets cookies for anything outside that "minimum-contacts" requirement, is liable for what it does with that data and access, and for what that code does.
hn_acker
The full title is:
> Ninth Circuit Takes a Wrecking Ball to Internet Personal Jurisdiction Law–Briskin v. Shopify
djoldman
The part where many may object:
> First, the majority might say that Shopify should not engage in privacy-invasive activities. I didn’t invest the energy to figure out the irreducible privacy elements of the plaintiffs’ claims, but if using cookies to track users is an essential part of the claim, then more privacy-protective options are not feasibly available to Shopify.
nickff
This seems like a very strange reading of "express aiming"; instead of those words meaning that a person has done something to 'target', it means that the person did not 'expressly avoid'? I am not sure that "expressly aim" has much meaning at all in this reading.
I don't have any horse in this race, though I know the EFF is very popular on HN, and that many people here are also against data collection.
3np
I guess it's just a coincidence that the California Shopify meetup groups are abandoned without notice?
The "online retailer" (IABMFG) in this case is based in California.
A company in California, selling to a customer in California, shouldn't be able to say "California law doesn't apply because my payment processor is Canadian." And if Shopify wants to take a cut of every sale from retailers based in California, they should be willing to comply with California law as well, at least insofar as it applies to the services provided via those California-based retailers' web sites.
(The actual opinion is linked at the bottom of the submission; I humbly suggest folks commenting here should read it first: https://cdn.ca9.uscourts.gov/datastore/opinions/2025/04/21/2...)