Please tell me about your experience running LLMs locally. Any recommendations?
>>107496180
Use Docker, it makes things easier.
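Rough sketch of the Docker route, assuming you go with ollama's official image (the image name, volume, and port below are its documented defaults; swap in whatever backend you actually use):
# start the API server (CPU-only); model data persists in the 'ollama' volume
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
# open an interactive chat with a model inside the container
docker exec -it ollama ollama run llama3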
Why do you need it local, Anon? What are you up to?
>>107496180
Don't do it.
>>107496180
>Please tell me about your experience running LLMs locally
The experience is like having an autistic super-human slave who does whatever you want.
>Any recommendations
DeepSeek + Qwen
Local LLMs are getting too good, they used to be years behind now they catch up to the newest SOTA releases in a matter of weeks. This is why jews are making hardware unaffordable for normal people, they thought they would have a monopoly on AI and they were left with the only option to panic buy all thr ram
I've been talking to an 8B model and it's surprisingly capable.
>>107496180
>experience
More retarded than cloud models but slightly easier to jailbreak (which only requires a double-digit IQ anyway, so not a high bar).
>recommendations
They're all shit, finetunes are just a different distribution of slop. Just go to /lmg/
>>107496192
Nothing silly, I just don't like cloud services.
>>107496212
Any elaboration?
>>107496180
If you ONLY care about running local language models and absolutely nothing else (seriously, ROCm is dogshit), buy an AI Max+ 395 mini PC with 128GB RAM. Otherwise enjoy OOMing on anything bigger than 8B-param models.
>>107496180
They're not smart, but playing around with uncensored AIs is fun. It feels good to have a computer actually follow my orders.
>>107496180
It's fantastic. Make sure you have the nvidia-container-toolkit installed for your containers (assuming you have an NVIDIA GPU).
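For reference, a hedged example of that setup, assuming Debian/Ubuntu and ollama's official image; the important bit is --gpus=all, which only works once the toolkit is wired into Docker:
# install the toolkit, register it with docker, restart the daemon
sudo apt-get install -y nvidia-container-toolkit
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
# now the container can see the GPU
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama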
I consider myself pretty adept at language, higher than average. I can write, I can code, and I enjoy figuring things out. The best use I've found for it is bouncing ideas off of, because I know I can come up with something better than even the most advanced LLM to date. It aims for average, slap bang in the middle of predictable, and while it hits that mark every time, I'm not about that.
>>107496180
>any recommendations?
Yeah, just install Alpaca and pick a trending model.
https://flathub.org/en-GB/apps/com.jeffser.Alpaca
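If you already have Flatpak with the flathub remote configured, installing and launching it is just this (app ID taken from that page):
flatpak install flathub com.jeffser.Alpaca
flatpak run com.jeffser.Alpaca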
Running local on an iPhone 13 kinda sucks ngl, don't do it. It's a fun little toy but the phone gets really hot. Using PrivateLLM.
>>107496279
>Any elaboration?
Yes.
>>107496180
I ran one of the distilled DeepSeek models on my gaming laptop. It runs fine, but since it's a distilled model, it's too retarded to rely on for anything but some laughs.
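A minimal way to try one of those distills yourself, assuming ollama as the runner (the poster didn't say what they used; the 8b tag here is one of the published R1 distills, pick whatever size fits your VRAM):
ollama run deepseek-r1:8b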
can you do loli roleplay?
>>107496279Silly anon. That faggot cockmongler has no elaborations besides 'MOAR GIBBS NAO'. He's basically the kike form of a nigger.
>>107496180
>>107493611