Today Google bricked my Chromebook by force-installing a hidden extension
68 comments
March 29, 2025 · scq
yard2010
You're paraphrasing a Google source; it might be biased or just lying. They need the money, after all, they can't risk it.
scq
You can read the source for the integration, it lines up exactly with what I said.
You can make up any conspiracy theory you want, but there's no evidence for it.
bri3d
This. Go to chrome://flags and disable “Enable OCR For Local Image Search” and I bet the problem goes away.
It’s a stupid feature for Google to enable by default on systems that are generally very low spec and badly made, but it’s not some evil data slurp. One of the most obnoxious things about enshittification is the corrosive effect it seems to have had on technical users’ curiosity: instead of researching and fixing problems, people now seem very prone to jump to “the software is evil and bad” and give up at doing any kind of actual investigation.
NoNotTheDuo
> but it’s not some evil data slurp.
Not yet anyway. We’ve just seen Amazon change how all Echos/Alexas operate. It has been local-only for years and years, but now they want the audio data, so they’ve changed the Terms of Service. There’s no reason to believe Google won’t do the same thing sometime in the future.
simpaticoder
"Trauma" is when one horrible experience lowers your danger threshold so much that it triggers on everything, and becomes useless and harmful. "Learning" is when new threat awareness lowers the threshold an 'appropriate amount'. Even if the GP was strictly wrong about their conclusion, in my personal opinion they are quite right to remain vigilant.
Note to parent: it is strictly unfair to lump Google in with Amazon (and if you demonize a good actor long enough, eventually they'll acquiesce since they are already paying the reputational price). However, given that they are American corporations operating on similar incentives during the Wild West (or World War) of AI aka WWAI, it makes sense to be suspicious. Heaven knows "reputational downside" is just about the only countervailing incentive left, since Trump has stripped consumers and investors of virtually all legal protection (see: CFPB elimination; SEC declines Hawk Tua coin grift prosecution; Trump pardons Trevor Milton). I think it is an excellent time for all of us to be extremely careful with the software we use.
mystified5016
> it’s not some evil data slurp
This puts a dangerous amount of trust onto a company which has very clearly and explicitly signaled to everyone for decades that they do not care one iota about you, your privacy, or your safety.
Assuming that Google isn't doing anything malicious is a very unwise and ill-informed stance to take. If it isn't malicious now, it will be very soon. Absolutely no exceptions.
_Algernon_
Learned helplessness is a common symptom of abuse. Not surprising that we would see it here as well.
bbarnett
> It’s a stupid feature for Google to enable
Enter Google 2025!
No longer just terrible search due to lack of care, and conflict of interest.
Instead, now terrible search due to AI, terrible everything due to AI, pushed everywhere and everyplace, degrading and reducing capabilities ecosystem wide.
Ridiculous and just often wrong AI gibberish on search pages, Android camera apps that blur people's faces when trying to "enhance" pics you take, and of course replacing OCR stuff that works well, with some half finished buggy AI junk.
From their doctored and made up AI demos, to an inability to make anything stable or of quality, Google has turned from world class to Nikola in a short couple of years.
TeMPOraL
> One of the most obnoxious things about enshittification is the corrosive effect it seems to have had on technical users’ curiosity: instead of researching and fixing problems, people now seem very prone to jump to “the software is evil and bad” and give up at doing any kind of actual investigation.
There's little here worth being curious about. Tech companies made sure of that. They mostly aren't doing anything particularly groundbreaking in situations like these - they're doing the stupid or the greedy thing. And, on the off chance that the tech involved is in any way interesting, it tends to have decades of security research behind it applied to mathematically guarantee we can't use it for anything ourselves - and in case that isn't enough, there's also decades of legal experience applied to stop people from building on top of the tech.
Nah, it's one thing to fix bugs for the companies back when they tried or pretended to be friendly; these days, when half the problems are intentional malfeatures or bugs in those malfeatures, it stops being fun. There are other things to be curious about, that aren't caused by attempts to disenfranchise regular computer users.
bri3d
> There's little here worth being curious about.
I’m all for OP returning the computer Google broke, as sibling comments have suggested, but the curiosity route would have been fruitful for them too; I’m pretty sure the flag I posted or one of the adjacent ones will fix their issue.
I also personally found this feature kind of interesting of itself; I didn’t know that Google were doing model-based OCR and content extraction.
> on the off chance that the tech involved is in any way interesting, it tends to have decades of security research behind it applied to mathematically guarantee we can't use it for anything ourselves
My current profession and hobby is literally breaking these locks and I’m still not quite sure what you mean here. What interesting tech do you feel you can’t use or apply due to security research?
> there's also decades of legal experience applied to stop people from building on top of the tech.
Again… I’m genuinely curious what technology you feel is locked up in a legal and technical vault?
I feel that we’ve really been in a good age lately for fundamental technologies, honestly - a massive amount of AI research is published, almost all computing related sub-technologies I can think of are growing increasingly strong open-source and open-research communities (semiconductors all the way from PDK through HDL and synthesis are one space that’s been fun here recently), and with a few notable exceptions (3GPP/mobile wireless being a big one), fewer cutting edge concepts are patent encumbered than ever before.
> There are other things to be curious about, that aren't caused by attempts to disenfranchise regular computer users.
If anything I feel like this is a counter-example? It’s an innocuous and valuable feature with a bug in it. There’s nothing weird or evil going on to intentionally or even unintentionally disenfranchise users. It’s something with a feature toggle that’s happening in open source code.
> it's one thing to fix bugs for the companies back when they tried or pretended to be friendly
Here, we can agree. If a company is going to ship automatic updates, they need to be more careful about regressions than this, and they don’t deserve any benefit of the doubt on that.
ashoeafoot
And what were you doing when they took over, Dad? Oh, I was an intestinal villain, secreting liberal free-choice messages for a corporation that didn't even pay me.
throwaway48476
In other words, tech companies have lost the benefit of doubt.
That's what a decade of enshittification gets them.
oefrha
A decade ago people were also posting these outrage posts about Big Tech (and Small Tech) that more often than not turned out to be bugs/nothingburgers. I was here.
hilbert42
As tossandthrow says, return it as defective. Reckon this should apply anywhere a warranty applies.
If what you say is correct, then the device is (a) not fit for purpose, and (b) you may be able to claim damages on the basis that the manufacturer has changed its modus operandi without your permission or consent and the device is now incompatible with the way you work, etc.
If Google reckons it had the right to alter your device because you agreed to its EULA, then it seems you'd still have a case on grounds that it no longer functions as it should.
There are only two things that will stop these bastards—them realizing such behavior is draining money from their hip pockets and proper consumer and privacy legislation.
But forget the latter, democracy is stuffed, and Big Tech has it by the balls anyway.
josephg
> But forget the latter, democracy is stuffed, and Big Tech has it by the balls anyway.
Not everywhere. Here in Vic, Australia, I can return a product for defects any time within its “expected product lifetime”. How long is that? It’s never specified explicitly! So yeah, it kinda doesn’t matter how old a laptop is if the manufacturer pulls stunts like this. You can still give them a headache if you want to.
Europe also has great consumer protection laws. And this domain is .NZ - I wouldn’t be surprised if New Zealand has decent consumer protection laws too.
The US’s democracy is stuffed. But thankfully the world is much bigger than the United States.
CamouflagedKiwi
Yes, NZ does. The Consumer Guarantees Act is pretty strong - goods must work for a "reasonable period", which is similarly not defined but has generally been upheld as you'd hope by the courts. Companies can't contract out of it.
_heimdall
> proper consumer and privacy legislation.
> But forget the latter, democracy is stuffed
What does consumer and privacy legislation have to do with democracy?
They may both be important, but I see no connection between the two other than the fact that those democratically elected would be the ones making the legislation (and any legislation).
hilbert42
Democracy doesn't work as it should when Big Tech and Big Business pull the strings to get what they want.
When entities other than ordinary citizens get their way—as they do—then citizens are disadvantaged. That ought to be pretty damn obvious, if not then take a look at the world around you.
For starters, examine the myriad of legislation that's beneficial to ordinary citizens that has been blocked or neutered by Big Tech/Business. Citizens may have the vote but they don't hold the power.
_heimdall
Whether it worked under a democratic system depends on whether the legislation you are concerned with was passed legally.
Democracy would have worked perfectly fine if democratically elected officials made decisions and passed legislation they were legally allowed to pass. We may disagree with what was passed, but that's a concern about the outcome rather than the process by which those people were elected.
I very much agree with you with regards to the problems of big tech, big business, and lobbying in general. They are technically operating within the laws created by democratically elected officials, though. That's the problem.
We need a smaller government with less reach and fewer powers. We don't need to claim that those who were democratically elected somehow escaped democracy while working within the bounds of the rules they were given, we need to limit the rules.
danieldk
This seems to have been a thing for almost a year?
https://support.google.com/chromebook/thread/286204300/utili...
https://old.reddit.com/r/chrome/comments/1et9y0m/what_is_scr...
This is apparently the source code:
https://chromium.googlesource.com/chromium/src/+/refs/tags/1...
tossandthrow
If the machine is less than 2 years old (and you are in Europe), just return it as defective.
em-bee
laws should be adapted to extend the warranty every time a remote change is made to the device. basically, the warranty should hold as long as the device is maintained. say, each update should come with half a year of warranty. it's a bit tricky as it could motivate companies to stop updating, but that could be solved with a separate law forcing companies to provide updates for an extra number of years. (if that doesn't exist already)
juergbi
This may make sense if the extended warranty is limited to defects introduced by the remote change. I.e., if they remotely break your device, they should be responsible for fixing the damage. A full warranty extension doesn't seem reasonable to me, though.
With regards to your last sentence, I think a good first step would be to require at least security and other critical updates to be provided within the full warranty period. And this would make sense even without the (limited) warranty extension, and I actually consider it more important.
em-bee
> if the extended warranty is limited to defects introduced by the remote change
yes, of course. it may be hard to distinguish though. the device getting hot may create additional stress on the mainboard or RAM or other parts causing it to break faster.
haswell
I don’t think relying on new laws as the primary incentive is going to get far, especially when big tech has the outsized influence they do on government. This then just leaves a strong incentive to stop updating things.
ginko
Sounds like a good way to make sure that companies drop SW support as quickly as possible.
tossandthrow
Not if it is accompanied by laws setting a minimum bar for the functioning of software, e.g. regular security patches.
However, this would be a great way of separating hardware and software products - and would that be so bad?
exe34
As long as they release all the source and build tools, I'm okay with that.
TeMPOraL
They already do.
ginko
If OP is in Europe Google could be drawn and quartered for GDPR violations.
kleiba
"could" being the point here - as Joe Bloke, you're not going to get yourself into a legal dispute with Google, but it's not very hard to return an electronic device as an ordinary end consumer when it's still under warranty.
dgellow
Anyone can submit a complaint: https://www.edps.europa.eu/data-protection/our-role-supervis...
Matthyze
That's not how the GDPR works. Just like criminal procedure, subjects do not sue alleged offenders themselves. The state instead sues on their behalf.
nottorp
You don't need to sue Google, just file a complaint with whatever the GDPR authority is in your country ...
mardifoufs
GDPR forbids local OCR? Can you be more specific?
hilbert42
Exactly!
rs186
I flashed UEFI firmware on my Chromebook to use Linux. I have been wondering whether that was the right decision, given the number of compatibility issues I have run into on that distro. But seeing this, I know I can live with those issues but not with Chrome.
tjpnz
If OP is a Kiwi, he should be covered under the Consumer Guarantees Act.
https://www.consumerprotection.govt.nz/general-help/consumer...
bri3d
The source is available: https://chromium.googlesource.com/chromium/src/+/refs/tags/1...
It’s not “training an AI model on screen contents without consent.”
It is a stupid feature for Google to enable by default: likely what’s making OP’s machine useless is that it’s running an OCR inference model on the OP’s images to index them for search.
Go to chrome://flags and disable “Enable OCR For Local Image Search” and I bet the problem goes away. The AI Service does have a few other features, but that’s the one that’s likely to be cooking the machine.
As for the other comments on this thread, I doubt there’s anything to do with GDPR here. It’s all local.
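For readers who want a concrete picture of what OCR-based local image search means, here is a minimal sketch of the general idea. It is not Chromium code; pytesseract and Pillow stand in for whatever on-device OCR model ChromeOS actually ships, and the directory name is made up. The point is only that the expensive part is the local OCR pass over your images, which is what can peg a low-spec CPU, not a network upload.

```python
# Hypothetical sketch of OCR-based local image search (not Chromium's code).
# pytesseract/Pillow are stand-ins for the real on-device OCR model.
from pathlib import Path

import pytesseract
from PIL import Image


def build_index(image_dir: str) -> dict[str, str]:
    """OCR every image once and keep the extracted text in a local index."""
    index = {}
    for path in Path(image_dir).expanduser().glob("*.png"):
        text = pytesseract.image_to_string(Image.open(path))  # CPU-heavy step
        index[str(path)] = text.lower()
    return index


def search(index: dict[str, str], query: str) -> list[str]:
    """Return paths of images whose OCR'd text contains the query."""
    q = query.lower()
    return [path for path, text in index.items() if q in text]


if __name__ == "__main__":
    idx = build_index("~/Downloads")  # hypothetical folder of screenshots
    print(search(idx, "invoice"))
```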
xg15
This also seems crazy to me on a technical level: OP says, this operates on the contents of the screen. So essentially the OS already has the text in memory, renders it to the frame buffer, then OCRs it back - I suppose because it's "cheaper" in dev time to just slap OCR on a screenshot than maybe spend some time looking up what's already accessible in memory and through the UI toolkits.
CPU time is indeed cheaper than dev time, especially if it's your users' CPUs and not yours.
bri3d
Well, kind of, I don’t think it’s that crazy. This service does two things:
* Performs image OCR on images, generically. This is then used for several features: “I type a word in the search box and it can look through my screenshots and photos,” “I’m in one of those horrible scanned image-only PDFs and I want to search,” and so on.
* Performs “main content extraction” on websites by using a screenshot of the website _alongside_ the accessibility tree for that website’s structure. It basically says “given this tree of elements and screenshot, can you prune the tree to just the elements a user would care about.” The fact that this is necessary is more an indictment of the DOM than this feature, IMO :)
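To make that second point concrete, here is a rough, hypothetical sketch of what "prune the accessibility tree to the main content, using the rendered page as a hint" could look like. The names and the hand-written heuristic are illustrative assumptions, not Chromium's actual interface; in the real feature a model presumably makes the keep/drop decision from the screenshot pixels rather than from a role list.

```python
# Hypothetical sketch of "main content extraction" (not Chromium's real API):
# combine a page's accessibility tree with visual salience information and
# keep only the nodes a reader would care about.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class AXNode:
    role: str            # e.g. "heading", "paragraph", "banner", "navigation"
    text: str
    screen_area: float   # fraction of the screenshot this node covers
    children: list["AXNode"] = field(default_factory=list)


BOILERPLATE_ROLES = {"banner", "navigation", "complementary", "contentinfo"}


def extract_main_content(node: AXNode, min_area: float = 0.01) -> Optional[AXNode]:
    """Drop boilerplate roles and visually negligible nodes; keep the rest."""
    if node.role in BOILERPLATE_ROLES or node.screen_area < min_area:
        return None
    kept = [
        pruned
        for child in node.children
        if (pruned := extract_main_content(child, min_area)) is not None
    ]
    return AXNode(node.role, node.text, node.screen_area, kept)
```

Either way, the shape of the problem is the same: tree plus screenshot in, smaller tree out.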
xg15
Ah ok, that seems more sensible. My understanding was it would pretty much literally make screenshots and then run OCR on them. If there is enough smartness to only run OCR on the parts that have no text information, it makes more sense. (Though as we see here, even that approach can be too much)
Animats
Is this the "PDF Searchify" feature?[1]
[1] https://windowsreport.com/chromes-new-feature-makes-scanned-...
scq
Looking at the Chromium source, PDF Searchify indeed uses this service (search for "ENABLE_SCREEN_AI_SERVICE").
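If you want to reproduce that check on a local Chromium checkout, a recursive search for that build flag is enough. Here is a small Python sketch (the checkout path and file extensions are assumptions; `git grep` or `ripgrep` over the tree would do the same job faster):

```python
# Find references to the ScreenAI build flag in a local Chromium checkout.
# The checkout path is an assumption; point it at your own src/ directory.
from pathlib import Path

FLAG = "ENABLE_SCREEN_AI_SERVICE"
CHROMIUM_SRC = Path("~/chromium/src").expanduser()

for path in CHROMIUM_SRC.rglob("*"):
    if path.suffix not in {".cc", ".h", ".gn", ".gni"}:
        continue
    try:
        if FLAG in path.read_text(errors="ignore"):
            print(path.relative_to(CHROMIUM_SRC))
    except OSError:
        continue
```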
ur-whale
I'm trying to remember the last time some actually positive and surprising (in a good way) news came out of Google.
I seem to remember a time when they produced one of those every week.
Lately it seems to be mostly the kind of fuck-up and misstep this article talks about.
And I'm not even mentioning those where the misbehaving is actually willful.
meta-level
This is what Microsoft did with their Copilot thing, but they knew 95% would not care or even realize.
verytrivial
I switched away from Chrome a few years ago when it became slightly evil about trying to own login information. I seem to have made the right choice. (I'm not mentioning what I switched to because it never results in meaningful discussion.)
pepa65
Mentioning what you switched to could help people who trust your judgment. Just ignore anybody who discusses it unfruitfully!
exe34
Completely normal for Google. A few weeks ago they bricked my Chromecast. It wouldn't take casting from Netflix anymore, so I looked it up, and everybody, including Google, tells you to reset the piece of crap. I did, and now Google Home won't connect to it anymore. Something about a certificate expiring somewhere, and now it won't work at all.
I'll try again in a few months and then bin it. I won't be purchasing any further cast/stick devices, I'll simply use a laptop and chrome, and when that stops working, I'll stop using Netflix or Amazon prime video.
scq
This seems like a bug in the ScreenAI service? There's no evidence whatsoever for his claim that Google "trains a machine vision model on the contents of my screen".
According to https://chromium.googlesource.com/chromium/src/+/main/servic... it is just inference.
> These functionalities are entirely on device and do not send any data to network or store on disk.
There is also this description in the Chrome OS source code:
> ScreenAI is a binary to provide AI based models to improve assistive technologies. The binary is written in C++ and is currently used by ReadAnything and PdfOcr services on Chrome OS.