Google Chrome is downloading a 4 GB Gemini Nano model onto users' machines without consent, with no opt-in, no opt-out short of enterprise tooling, and an automatic re-download every time the user deletes it. The pattern is identical to the Anthropic Claude Desktop case I wrote about last month, but the scale is between two and three orders of magnitude larger. This article does the legal analysis and, for the first time, the environmental analysis. The numbers are not small.
The download is the headline and what everyone is latching onto, but few are seeing the other problems: it installed the model silently without user acceptance, it relies on obscurity to hide the model on disk, and it re-downloads the model if you simply delete it (fortunately there's an uninstall process linked in the comments; does that include uninstalling Chrome?). And then it presents itself as an optional AI feature in the browser, but will apparently be used for any searching. That means more energy use, since the processing isn't actually local; the weights just sit in local storage.
It's all bad, even if it weren't AI. This is what malware does.
Unnecessarily long article (which says “4GB” 33 times, and the complete phrase “4GB AI model” ten times)… Once or twice was all I needed.
But the article author(s) do land on a good point. If this is pushed out to ~15% of Chrome users without consent:
And that’s just for the initial data push. Models need ✨updates!✨
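A rough back-of-envelope sketch of what that initial push adds up to. The Chrome user count here is a commonly cited estimate, not a figure from the article; the 15% rollout and 4 GB size come from the comments above:

```python
# Back-of-envelope: total data transferred for the initial model push.
# Assumed inputs (only the rollout % and model size come from the thread):
chrome_users = 3.4e9        # assumption: ~3.4 billion Chrome users
rollout_fraction = 0.15     # ~15% rollout, as stated above
model_size_gb = 4           # reported model size

total_gb = chrome_users * rollout_fraction * model_size_gb
total_pb = total_gb / 1e6   # 1 PB = 1,000,000 GB (decimal units)

print(f"~{total_pb:,.0f} PB transferred for the initial push")
# → ~2,040 PB transferred for the initial push
```

That's on the order of two exabytes before a single model update ships, which is why the per-update cost compounds so quickly.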