How do we solve the RAM crisis? A good gaming build should have 32 GB, a workstation (heavy compilation, databases, calculations) 64 GB, and a good home server + simulation box ideally 512 GB (4x128 GB).
Any outlook for RAM prices returning to near normal in 2026, or should I just give up? What about the new fabs being built in China? Or any tech on the horizon to make LLMs use less VRAM? Or, since AI datacenters are sitting on hardware they can't even power (they can't get grid connections for it), maybe they'll slow down a bit?
>>107824595
>any outlook for RAM prices
Sure: after last year's increase we just got another small increase, and the outlook for the near future is yet another increase. Because why not? It's not like you can do without RAM. Are you just not going to buy? Nah.
I was hoping that would slump PC sales and GPUs would get cheaper. But they already slowed production 30-40% and cancelled the refresh.
>>107824720
GPUs need RAM too, so a RAM crisis tends to make them more expensive.
>>107824584
ChatGPT uses hardly any RAM. What do I need more for?
>>107824584
The Xbox 360 had 512 MB of system memory and it could run Halo 3 plus network and social features at the same time. You don't need more than that.
>How do we
YOU won't do shit, as usual. You'll just ask others to do it for you.
>how do we force AI datacenters to stop using RAM
lol, you're a stupid little gamer, huh?
>>107825991
>lol, you're a stupid little gamer, huh?
Oh yeah, that's why I pointed out the 512 GB need for heavy VMs.
>>107825887
>ChatGPT uses hardly any RAM
Incorrect. Before I upgraded mine it would regularly freeze up the entire browser. After dropping the new stick in, it's noticeably faster and rarely freezes up anymore.