Google Chrome is downloading a 4 GB Gemini Nano model onto users' machines without consent, with no opt-in, no opt-out short of enterprise tooling, and an automatic re-download every time the user deletes it. The pattern is identical to the Anthropic Claude Desktop case I wrote about last month, but the scale is between two and three orders of magnitude larger. This article does the legal analysis and, for the first time, the environmental analysis. The numbers are not small.
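The "enterprise tooling" opt-out mentioned above refers to Chrome's managed-policy mechanism. As a hedged sketch (the policy name `GenAILocalFoundationalModelSettings`, the value 1 meaning "do not download", and the Linux path `/etc/opt/chrome/policies/managed/` are taken from public reports of Chrome's enterprise policy list - verify all three against Google's current policy documentation for your Chrome version), a managed policy file disabling the download might look like:

```json
{
  "GenAILocalFoundationalModelSettings": 1
}
```

On Windows the equivalent value would be set under the HKLM Chrome policy registry key. On an unmanaged consumer install this mechanism is effectively out of reach for ordinary users, which is the article's point.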
Beyond the disk size, what’s wrong with this? Isn’t local machine learning better than shipping your data off to some cloud provider? Or is the problem machine learning?
The AI Mode pill in the Chrome 147 omnibox is a cloud-backed Search Generative Experience surface - every query the user types into it is sent over the network to Google’s servers for processing by Google’s hosted models. The on-device Nano model is not invoked by the AI Mode UI flow at all. They are entirely separate code paths: the most visible AI affordance in the browser does not use the local model the user has been silently given, while the features that do use the local model (Help-Me-Write in <textarea>, tab-group AI suggestions, smart paste, page summary) are buried in textarea context menus and tab-group right-click menus that the average user will never discover.
What a double kick to the dick. First, they silently download 4gb to your disk, and they still fucking send your shit to their cloud AI.
The problem, mostly, is that it is installed without informing the user or asking for consent, and it reinstalls even if you delete it, which is what malware does. I may not want a local AI working on my laptop on battery while I am working and browsing the internet.
To me this seems arbitrary—Chrome contains countless other binary blobs which you have no insight into and cannot consent to. They are part of the application. Chrome contains other machine learning algorithms and features and has for years, but these have been baked-in. You have likely been using “local AI” or machine learning on your laptop battery for quite some time without being explicitly aware of it.
If people don’t like these features that’s fine, there are lots of alternatives to choose from (personally, I use Helium). But to be upset about this specific instance seems arbitrary to me. And to claim that it’s somehow nefarious (i.e. the consent part) seems disingenuous. Consent is granted when the user downloads and begins using Chrome—why would Chrome need additional consent to download/update one of many external components?
Again, I don’t use Chrome and I’m not interested in this feature, I just don’t see how it’s necessarily bad or evil all things considered.
If you read the article, it summarizes very succinctly everything wrong with this. It is illegal for a variety of reasons in some places (Europe, California); it’s wasteful; and it means 4 fucking GB of unrequested data, which can be a problem on metered connections in many places. 4 GB. This model is not essential for Chrome to function as a browser at all, and in all likelihood, unless you use it for generating text, you are probably not even using it.
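For anyone who wants to check how much disk the component actually occupies, here is a minimal sketch. The directory name `OptGuideOnDeviceModel` (under Chrome's user-data directory) and the default Linux profile path are assumptions based on public reports of where the Optimization Guide on-device model is stored, not something confirmed by the article - adjust both for your platform and Chrome version:

```python
from pathlib import Path

# Hypothetical component directory name, based on public reports;
# verify against your own Chrome installation.
MODEL_DIR_NAME = "OptGuideOnDeviceModel"

def dir_size_bytes(path: Path) -> int:
    """Sum the sizes of all regular files under `path`."""
    return sum(p.stat().st_size for p in path.rglob("*") if p.is_file())

def report_model_usage(user_data_dir: Path) -> int:
    """Return bytes used by the on-device model directory, or 0 if absent."""
    model_dir = user_data_dir / MODEL_DIR_NAME
    if not model_dir.is_dir():
        return 0
    return dir_size_bytes(model_dir)

if __name__ == "__main__":
    # Default Linux location; macOS and Windows paths differ.
    default = Path.home() / ".config" / "google-chrome"
    used = report_model_usage(default)
    print(f"{used / 2**30:.2f} GiB used by {MODEL_DIR_NAME}")
```

Deleting that directory is exactly the step the article says gets silently undone: the component updater re-downloads it on the next check.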
Beyond the disk size, what’s wrong with this? Isn’t local machine learning better than shipping your data off to some cloud provider? Or is the problem machine learning?
They’re absolutely shipping all your local data up to their cloud.
It’s likely (and one of the reasons I don’t use Chrome), but it’s not the discussion we’re having.
Per Passerby6497’s comment above:
What a double kick to the dick. First, they silently download 4gb to your disk, and they still fucking send your shit to their cloud AI.
To me this seems arbitrary—Chrome contains countless other binary blobs which you have no insight into and cannot consent to. They are part of the application. Chrome contains other machine learning algorithms and features and has for years, but these have been baked-in. You have likely been using “local AI” or machine learning on your laptop battery for quite some time without being explicitly aware of it.
If people don’t like these features that’s fine, there are lots of alternatives to choose from (personally, I use Helium). But to be upset about this specific instance seems arbitrary to me. And to claim that it’s somehow nefarious (i.e. the consent part) seems disingenuous. Consent is granted when the user downloads and begins using Chrome—why would Chrome need additional consent to download/update one of many external components?
Again, I don’t use Chrome and I’m not interested in this feature, I just don’t see how it’s necessarily bad or evil all things considered.
I am upset because I am aware of this one. How can I be upset about something I am not even aware of?
The user should be informed about what they are getting. By that logic Chrome can also install and run a crypto miner.