/g/ - Technology


Thread archived.
You cannot reply anymore.




File: unnamed.png (18 KB, 512x512)
Please tell me about your experience running LLMs locally. Any recommendations?
>>
>>107496180
Use docker, it makes things easier
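If anon goes the docker route, a minimal sketch with Ollama's official image looks like this (image name and port are from Ollama's published docs; the model tag is just an example, not a recommendation from the thread):

```shell
# Start the Ollama server in the background (CPU-only here; add --gpus=all for NVIDIA)
if command -v docker >/dev/null 2>&1; then
  docker run -d --name ollama -v ollama:/root/.ollama -p 11434:11434 ollama/ollama || true
  # Pull and chat with a small model; pick a tag that fits your RAM
  docker exec -it ollama ollama run llama3.2 || true
else
  echo "docker not found, skipping"
fi
setup=done
```

The named volume keeps downloaded weights across container restarts, which matters since model files run into the gigabytes.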
>>
Why do you need it local, Anon? What are you up to?
>>
>>107496180
Don't do it.
>>
>>107496180
>Please tell me about your experience running LLMs locally
The experience is like having an autistic super-human slave who does whatever you want

>Any recommendations
DeepSeek + Qwen
>>
Local LLMs are getting too good, they used to be years behind now they catch up to the newest SOTA releases in a matter of weeks. This is why jews are making hardware unaffordable for normal people, they thought they would have a monopoly on AI and they were left with the only option to panic buy all thr ram
>>
I've been talking to an 8B model and it's surprisingly capable.
>>
>>107496180
>experienced
More retarded than cloud models but slightly easier to jailbreak (which only requires a double digit IQ anyway so not a low bar)
>recommendations
They're all shit, finetunes are just a different distribution of slop
Just go to /lmg/
>>
>>107496192
Nothing silly, I just don't like cloud services
>>107496212
Any elaboration?
>>
>>107496180
If you ONLY care about running local language models and absolutely nothing else (seriously, ROCm is dogshit), buy an AI Max+ 395 mini pc with 128GB RAM. Otherwise enjoy OOMing on anything more than 8B param models.
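For a rough sense of why ~8B is the ceiling on ordinary GPUs, the back-of-envelope math is billions of params × bits per weight ÷ 8 for the weights alone, before KV cache and runtime overhead. A sketch in shell arithmetic (the 5 bits/weight figure approximates a Q4_K_M-style quant and is an assumption, not a spec):

```shell
# Approximate weight size in GiB: billions of params * bits per weight / 8
params_b=8     # an 8B-parameter model
bpw=5          # ~4.5-5 bits/weight for a Q4_K_M-style quantization
size_gib=$(( params_b * bpw / 8 ))
echo "$size_gib"   # -> 5 GiB of weights, before KV cache and overhead
```

By the same arithmetic a 70B model at the same quant needs ~44 GiB, which is where the 128GB unified-memory box starts to make sense.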
>>
File: file.png (80 KB, 870x863)
>>107496180
They're not smart but playing around with uncensored AIs is fun. It feels good to have a computer actually follow my orders.
>>
>>107496180
It’s fantastic. Make sure you have the nvidia-container-toolkit for your containers (assuming you have an nvidia gpu)
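For reference, the usual toolkit setup on Debian/Ubuntu (commands are from NVIDIA's container-toolkit docs; assumes their apt repo is already configured, and only does anything on a host that actually has the NVIDIA driver):

```shell
# Only meaningful on a host with an NVIDIA driver installed
if command -v nvidia-smi >/dev/null 2>&1; then
  sudo apt-get install -y nvidia-container-toolkit       # Debian/Ubuntu package name
  sudo nvidia-ctk runtime configure --runtime=docker     # wires the runtime into dockerd
  sudo systemctl restart docker
  docker run --rm --gpus all ubuntu nvidia-smi || true   # container should list the GPU
fi
ok=1
```

If the last command prints the same GPU table as running nvidia-smi on the host, containers can use the card.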
>>
File: be9-2001459920.jpg (15 KB, 200x250)
I consider myself pretty adept at language, higher than average. I can write, I can code, and I enjoy figuring things out. The best use I've had for it is to bounce ideas off, because I know I can come up with something better than even the most advanced LLM to date. It aims for average, slap bang in the middle of predictable, and while it hits the mark every time, I'm not about that
>>
>>107496180
>any recommendations?
yeah, just install Alpaca and pick the trending model.
https://flathub.org/en-GB/apps/com.jeffser.Alpaca
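The Flathub page linked above gives the install one-liner (app ID taken from that URL):

```shell
if command -v flatpak >/dev/null 2>&1; then
  flatpak install -y flathub com.jeffser.Alpaca || true
else
  echo "flatpak not installed"
fi
# then launch it with: flatpak run com.jeffser.Alpaca
installed=1
```

Alpaca is a GTK front end over Ollama, so model downloads happen from inside the app once it's running.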
>>
running local on an iphone 13 kinda sucks ngl, don’t do it. it’s like a fun little toy but the phone gets really hot. using privatellm
>>
>>107496279
>Any elaboration?
Yes.
>>
>>107496180
i ran one of the distilled deepseek models on my gaming laptop. it runs fine, but since it's the distilled model, it's too retarded to rely on for anything but some laughs
>>
can you do loli roleplay?
>>
>>107496279
Silly anon. That faggot cockmongler has no elaborations besides 'MOAR GIBBS NAO'. He's basically the kike form of a nigger.
>>
>>107496180
>>107493611


