https://www.thatprivacyguy.com/blog/chrome-silent-nano-install/
Google Chrome is reaching into users' machines and writing a 4 GB on-device AI model file to disk without asking. Chrome installs a file called weights.bin (~4 GB) in a hidden folder (OptGuideOnDeviceModel). This file contains the Gemini Nano on-device AI model. If you delete the file, Chrome reinstalls it automatically.
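You can check for it yourself. The path below is the default Linux profile location (an assumption; Windows keeps the profile under %LOCALAPPDATA%\Google\Chrome\User Data and macOS under ~/Library/Application Support/Google/Chrome, and Chrome channels like Beta use different dirs):

```shell
# Look for the on-device model dir and report its size if present.
# Path is the default Linux Chrome profile location -- adjust per OS/channel.
dir="$HOME/.config/google-chrome/OptGuideOnDeviceModel"
if [ -d "$dir" ]; then
  result="$(du -sh "$dir")"   # e.g. "4.0G  /home/you/.config/.../OptGuideOnDeviceModel"
else
  result="no OptGuideOnDeviceModel dir at $dir"
fi
echo "$result"
```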
>Complains about a 4GB file
>Posts screencap of a 12KB file
Huh
>>108763850
Vivaldi doesn't have this problem :)
>>108763850
sorry not a chromejeet, but sorry to hear that bro, keep using shitty software
>>108763958
I think this was a sort of mobilization/awareness call more than a generic flamewar thread
WHO BRAVE?!?
so is the model good?
I'm surprised it took them this long. I never trusted Gargle with my bits and bytes. Who Waterfox master race up in here?
Imagine installing jewgle malware
>>108763850
So if I were to install Chrome on my Chromebook it would take up a third of the hard drive
>>108763850
I download 4k linux distros without consent and they're over 40gb each
>>108763850
>without consent
you didn't read the terms and conditions
>>108763850
I don't use chrome but what if I delete the file, touch the filename as root, and remove write access for normal user accounts? The file will still be there so Chrome can't make a new one, but the user account Chrome runs under won't have permission to change it.
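The decoy-file trick described above can be sketched like this. It's demoed on a throwaway scratch path, not the real profile dir; against the real thing you'd target something like ~/.config/google-chrome/OptGuideOnDeviceModel (path is an assumption, it varies per OS and channel) and run the chmods as root so your own account can't flip the permissions back:

```shell
# Demo of the decoy-file trick on a scratch dir. For real use, run as root
# against the actual OptGuideOnDeviceModel dir so the browser's user can't
# chmod it back.
dir="$(mktemp -d)/OptGuideOnDeviceModel"
mkdir -p "$dir"
: > "$dir/weights.bin"        # zero-byte decoy with the expected name
chmod 444 "$dir/weights.bin"  # read-only: can't be overwritten in place
chmod 555 "$dir"              # dir not writable: the decoy can't be unlinked
                              # and no fresh weights.bin can be created
ls -l "$dir/weights.bin"
```

Note that blocking a delete requires stripping write permission on the *directory*, not just the file, since unlinking is a directory operation. Whether Chrome just retries at a different location when this path is locked is untested.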
>>108763997
it's the so-called Gemini Nano
I don't use chrome for much, just random shit that refuses to work right in FF. Deleted everything in the directory and chown'd it as root. Chrome can't write to it so it can get fucked.
>>108763850
LMFAO
>>108763850
>local AI
Uhh based? they care about privacy
>>108763850
Ungoogled Chromium doesn't have this problem.
>>108764893
>they care about privacy
>>108763992
Brave will copy this in 2 weeks.
>>108763850
Why is orange reddit defending this so vehemently?
Most mobile users still pay by the gigabyte, Google is probably going to have to pay out big now. And 1GB plans are still common at the ultra low end.
Local AI: based.
Forcing me to DL a 4GB model I don't fucking want: go fuck yourself.
Simple.
Trannyfox has an AI kill switch thoughBEIT
Is this going to fuck with Supermium at all?
so what kind of LLM is this exactly? what inference engine are they using? why is it a .bin and not something normal like a .gguf? how many parameters & which quantization?
>>108767197
>orange reddit
Is this a specific website I'm not catching a reference to, or do you refer to every website by its primary color scheme as an epithet?
anyway it's weird, every single thing on reddit that's even mildly critical of AI gets blammed into oblivion. It's incredibly unsubtle, I'm just surprised at how thorough it is
>>108763850
Can 4GB local models do anything? I thought I'd need 32GB on a fancy GPU before considering local models.
>>108763850
Ok. What are you going to do about it? Little bitch.
>>108763850
>muh 4 gigas!
>muh consent!
>help I'm being gigabyte-raped!
I don't see why people are upset that Google is giving out more model weights. Anyone figure out how to run inference with it outside of chrome?
>>108764965
This!
>>108772500
You're way behind the times. Qwen has a 0.8B text-and-vision model that is less than 1GB. I'll say it again, text AND vision, it can fucking read images and knows what you're looking at.
>>108772332
>every single thing on reddit that's even mildly critical of ai gets blammed into oblivion
You must have browsed some bizarro reddit then