i truly don't get why linux users are so obsessed with their ram usage. i get it, you're penny-pinching because you're still on 2gb of ram, but the rest of humanity really doesn't give a shit, esp since win 11 manages memory adaptively, so pic related being posted is pointless
>>108728854
I agree, but why is your ram running at slow normie JEDEC speeds? Too stupid to even turn XMP on in the BIOS, let alone tune it?
>>108728955
that is max speed
>>108728955
Yes, but usually only a little, and it depends on the workload.

For most AI use:
If the AI is running mainly on the GPU, faster system RAM barely matters.
If the AI is running on the CPU or using shared memory (like some laptop iGPUs), RAM speed can matter more.

Examples:
Gaming + AI features: tiny difference
Local LLMs on CPU: noticeable but still not dramatic
Integrated graphics / no dedicated GPU: can matter a lot more
Training models on a strong GPU: VRAM matters far more than regular RAM

Rough idea: turning on XMP might give 0–3% improvement in many cases, sometimes 5–10%+ if the task is memory-sensitive.

So the insult is often more about "you don't know your hardware" than some huge real-world speed boost.
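If you want to check instead of argue: a rough pure-Python sketch (buffer size and run count are arbitrary picks, and a big buffer copy is only a crude proxy for sequential bandwidth) that you can run once at JEDEC speeds and once with XMP on, and see whether the GB/s figure actually moves.

```python
import time

def measure_copy_bandwidth(size_mb: int = 256, runs: int = 5) -> float:
    """Time a large buffer copy and return the best observed rate in GB/s.

    A crude proxy for sequential memory bandwidth: if your workload is
    memory-bound, enabling XMP should raise this number noticeably;
    if it barely moves, the thread's 0-3% estimate is about right.
    """
    src = bytearray(size_mb * 1024 * 1024)
    best = float("inf")
    for _ in range(runs):
        t0 = time.perf_counter()
        dst = bytes(src)  # forces a full read of src and write of dst
        best = min(best, time.perf_counter() - t0)
        del dst
    # the copy touches size_mb twice: read the source, write the copy
    return (2 * size_mb / 1024) / best

if __name__ == "__main__":
    print(f"~{measure_copy_bandwidth():.1f} GB/s effective copy bandwidth")
```

Interpreter overhead makes the absolute number lower than a real STREAM benchmark would report, but the before/after comparison is still meaningful on the same box.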
>>108728955
>normie jedec speeds
XMP is "enthusiast tier" and fine if you're a gamer who can tolerate the increased risk of errors. A lot of workstations, especially those running ECC RAM, won't even expose such options in the BIOS. You'd violate 'enterprise validation', and if you're running compute-heavy tasks for long periods, any error is intolerable.
>>108729026
none of that applies to you THO
>>108729030
I'm not OP, but it absolutely does.
>>108729036
post it
>>108729036
well?