Why is most AI shit online-only?
You can run a lot of AI stuff on your own hardware offline; check the catalog for /lmg/ and /ldg/ for a starting point.
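A minimal sketch of what "run it locally" looks like with llama.cpp (repo URL, build steps and binary name are from memory and can differ by version; the model file is just a placeholder, use whatever GGUF fits your RAM):

# one-time setup: build llama.cpp and put a quantized model on disk (the only steps that touch the internet)
git clone https://github.com/ggml-org/llama.cpp && cd llama.cpp
cmake -B build && cmake --build build --config Release
# once the weights are local, this runs fine with the cable unplugged
./build/bin/llama-cli -m ~/models/model-Q4_K_M.gguf -p "hello from an offline box"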
I'm glad I jumped off of fedora this year, I'm just sticking to debian now
>>108008299
even the local apps are invasive with telemetry or saas grifting. we need projects that don't touch the web at all
>>108008299
as opposed to? you want gemini to fold your laundry and gpt to do your dishes? robots will be too expensive for most, you're better off hiring a mexican nana to do those for you
Why do people make OPs with images that are only tangentially related to the topic?
>>108008355
As opposed to running locally, where you don't have to constantly buy more tokens.
>>108008337
https://rentry.org/IsolatedLinuxWebService or running the program with unshare -r -n will castrate all the telemetry. if you don't want to see saas shilling, don't use comfyui, use forge neo or sd.cpp instead, and llama.cpp or kobold.cpp for text.
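what that unshare line does, if you want to see it for yourself (assumes util-linux unshare and unprivileged user namespaces enabled on your kernel; the llama-cli path is just the example from above):

unshare -r -n ip a
# the new network namespace only has a loopback interface in state DOWN and no routes,
# so nothing the program does can reach the web
unshare -r -n ./build/bin/llama-cli -m ~/models/model-Q4_K_M.gguf -p "hello"
# caveat: -n also gives the process its own 127.0.0.1, so a web UI started inside the
# namespace won't be reachable from your normal browser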
>>108008564
you have a crap ton of local models, what are you talking about
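rough napkin math on what they actually need (the ~0.6 bytes/weight figure for Q4-ish quants is hand-wavy and ignores context/kv-cache, treat it as an order-of-magnitude estimate):

awk 'BEGIN { printf "8B @ ~Q4: %.0f GB   70B @ ~Q4: %.0f GB   70B @ fp16: %.0f GB\n", 8e9*0.6/1e9, 70e9*0.6/1e9, 70e9*2/1e9 }'
# roughly 5 GB, 42 GB and 140 GB; only the very biggest open models at high precision
# need anywhere near a terabyte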
>>108008299
>online-only
The same reason most games today require online connectivity, or microslop forcing you to use an online account: it's easier to monetize.
>>108008299
the full-size llms require 1tb of ram
>>108008299
Bye bye quality!
>>108008321
>I'm glad I jumped off of fedora this year, I'm just sticking to debian now
Good, so you just wait 2 years for the AI-written code to be merged into debian. 90% of programmers already use AI assistance. The good thing about Linux and FOSS in general is that people contribute out of passion; only a few elite programmers are paid, and they don't need AI. Chill anon.