/g/ - Technology






File: usecase 32gb.png (64 KB, 1232x697)
use case for more than 32GB RAM? 99% of users would never need more than 32GB.
>>
>>107699394
Fitting your mom's ass on a single screen.
>>
>>107699416
ha! owned!
>>
>>107699394
LLMs, filesystem mirrored in RAM, compiling Firefox with enough threads to make memory usage inflate.

Pretty soon every desktop is going to ship with a local LLM as standard, performing some basic services, but RAM prices need to normalize first.
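On the "compiling Firefox with enough threads" point: a common rule of thumb is to cap parallel compile jobs by available RAM, not just core count. A minimal sketch, assuming roughly 2 GiB per C++ compile job (that figure varies wildly by translation unit and is an assumption, not a measurement):

```python
# Rough sizing sketch: how many parallel compile jobs fit in RAM.
# per_job_gib and reserve_gib are assumed ballpark numbers, not measured.
import os

def max_jobs(total_ram_gib: float, per_job_gib: float = 2.0,
             reserve_gib: float = 4.0) -> int:
    """Cap parallel jobs by RAM (leaving OS headroom) and by core count."""
    by_ram = int((total_ram_gib - reserve_gib) // per_job_gib)
    by_cpu = os.cpu_count() or 1
    return max(1, min(by_ram, by_cpu))

print(max_jobs(32.0))  # a 32 GiB box
print(max_jobs(16.0))  # a 16 GiB box
```

On a 16 GiB machine with many cores, the RAM term is usually what bites first, which is why big link-time-optimized builds are a legitimate reason to go past 32GB.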
>>
File: file.png (64 KB, 705x600)
>>107699394
use case for more than 16GB of RAM?
>>
>>107699491
big tech would never want local LLMs. they want everything on centralized cloud servers where the user can't do anything useful offline. So that's unlikely outside of certain people in the FOSS realm
>>
>>107699537
Too late. They already exist. It's happening whether anyone likes it or not.
>>
>>107699394
>use case for more than 32GB RAM?
Bloated games and frivolous AI experimentation. The latest server I provisioned for a 50+ employee law firm is a dual Xeon setup running Oracle Linux: 16TB RAID array and 32GB RAM. It handles their VPN and local DNS, a building-wide backup system, and dozens of samba shares. It rarely gets past 10GB of memory consumption, and that's only when it's being absolutely pounded for extended periods of time.
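That "rarely gets past 10GB" figure is easy to check for yourself. A minimal sketch for Linux boxes like the one above (reads `/proc/meminfo`; subtracting `MemAvailable` from `MemTotal` is the usual way to exclude reclaimable cache):

```python
# Linux-only sketch: report total and actually-used RAM in GiB
# by parsing /proc/meminfo (values there are in KiB).

def meminfo_gib() -> dict:
    fields = {}
    with open("/proc/meminfo") as fh:
        for line in fh:
            key, value = line.split(":", 1)
            fields[key] = int(value.split()[0])  # KiB
    total = fields["MemTotal"] / 1024**2
    avail = fields["MemAvailable"] / 1024**2
    return {"total_gib": round(total, 1), "used_gib": round(total - avail, 1)}

print(meminfo_gib())
```

Note that `used` here excludes page cache, so it's closer to what an admin means by "memory consumption" than the raw `free`/`used` columns.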
>>
File: task.jpg (50 KB, 403x443)
>>107699529
use case for more than 128MB of RAM?
>>
>>107699394
>>107699529
Any 3D rendering software, or 4k-res artwork with many layers in PS.
Also gaming at 1440p: Space Engineers eats ~50 gigs on my save.
>>
>>107699659
That pic has 2GB thougheverbeit
>>
>>107699666
>3D rendering
>4k res artworks
Not a use case. Buy a paintbrush and some clay.

>gaming
Definitely not a use case.
>>
>>107699689
Alright man, I'll just sell my PC then.
>>
>>107699696
Thank you. I will buy it for $500. You can find me on discord @getonmylevelpleb
>>
>>107699537
> big tech would never
Hardware-wise, you can already run some pretty big local LLMs on Strix Halo (AMD), Mac Studio (Apple), and DGX Spark (Nvidia). Intel plans on selling something similar, too. Heck, even a 64GB Mac Mini can run some reasonably large models. For smoler LLMs, just use a GPU.
As for the actual local LLMs, maybe give Llama (Facebook), Gemma (Google), and GPT-OSS (OpenAI) a try, among others.
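For anyone sizing RAM for the models listed above, the back-of-the-envelope math is just parameter count times bits per weight, plus runtime overhead (KV cache, activations). A rough sketch; the 20% overhead factor is an assumption, not a measured number, and real usage depends on context length and runtime:

```python
# Back-of-the-envelope RAM needed for LLM weights at a given quantization.
# overhead=1.2 is an assumed fudge factor for KV cache etc., not measured.

def weights_gib(params_billion: float, bits_per_weight: float,
                overhead: float = 1.2) -> float:
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total * overhead / 1024**3

for name, b, bits in [("8B @ Q4", 8, 4), ("70B @ Q4", 70, 4),
                      ("70B @ FP16", 70, 16)]:
    print(f"{name}: ~{weights_gib(b, bits):.1f} GiB")
```

This is why a 4-bit ~8B model fits comfortably in a normal GPU or 16GB of system RAM, while 70B-class models at 4-bit push you toward the 64GB+ unified-memory machines mentioned above.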
>>
>>107699394
I have 24
>>
>>107699529
why does chrome_crashpad_handler need 32gb of ram
>>
>>107699394
>Windows 11
>Shittel
>No XMP
>Giving advice
Please go back, just stick to your phone and buy a switch 2.
>>
>>107699537
Stupid fucking retard lmao





All trademarks and copyrights on this page are owned by their respective parties. Images uploaded are the responsibility of the Poster. Comments are owned by the Poster.