If your cheap computer were able to run a 2026 state-of-the-art AI model locally, you would want to run something even better in the cloud.
>>108499125
fuck the cloud and fuck digital transience and fuck you
>>108499125
I have a media server specifically because I do not want to rely on THE CLOUD
Kill yourself
>>108499125
True. I thought about this.
But I concluded it's still good to have SOME local capability.
Doing most of the work with local models and then doing the really hard stuff with the cloud is better than doing everything with the cloud.
>>108499140
fpbp
HI <3
Small local models run well on Android; you are most likely saving a significant amount of electricity by running the model on ARM
>>108499413
interesting, tell me more
>>108499413
Name one usable model that runs on Android
>>108499125
>the cloud
I'm just not that interested in paid storage offered by companies that somehow both want my money and want me dead.