OH OH OH NO NO NO HAHAHHA
Get with the times, gramps.
>>101721181
3060 has 12gbs ahjhahahaanvjew
>>101721181
AI FAGS BTFO!
>>101721181
>buying nvidia
>in 2024
>>101721181
If your card doesn't have 16GB in 2024, it's useless for real-world use. All you get is goyslop <16GB. Even 16GB isn't enough in the coming future as you would want >200GB VRAM
>nvidia refuses to add more vram
>amd can't into AI
>intel flatlined
now what?
>>101721562
Manually add more vram via soldering and shady chinesium drivers
>>101721562
CHINA SAVE US
>>101721527
mf I just play rpcs3 games and dmc5 on my pc
>>101721585
i wish this was possible with the 3090, i have two of them
>>101721562
>amd can't into AI
Good. That way faggots like you will ignore them and I can safely buy good GPUs for a decent price to play my vidya. And you can pay extra for an 8gb card in 2025 and whine how every other card is shit because they don't support your completely irrelevant money bubble.
>>101721562
>>101721594
>>101721585
As joke-y as it sounds, our options are pretty much Chinesium hacks, or China manages to clone CUDA, or some software hacks to use CUDA on AMD like zluda or scale.
Certainly a sad state of affairs.
>>101721585
Cyberpunk seal of approval
>>101721837
You can
I get why a 5060 should be cut down, but then why give it GDDR7?
>>101721181
>that bus
>that VRAM
goyim will still buy droves of them because they get prebuilds
>>101721562
>muh AI
who cares bro
>>101721181
>$2000 starting price
>>101721181
Enshittia (TM)
>>101721527
hoping to snag a 7900xtx deal. for now, I will rock my 6950
>>101721181
My old GTX 1070 from 8 years ago has 8 GB VRAM, what are they doing?
same memory size as my 8yo rx570
>>101722193
servers store your prooompts. it is what it is, but I prefer some privacy.
>>101722234
They jewmax Cuda core count and now they're doing the same with VRAM. It's a double jew sword.
>>101721585
They'll lock this down with firmware btw. Already did this with the 3090 (not sure about 4000 series)
xx60 and xx70 will forever be clueless retard poorfag tier
>>101721181
GPUs at this point should have a base of 16gb and higher models should be nearing 32gb. This is just lazy moneygrabbing at this point.
>>101721562
>can't ai
Thank god, "AI" is stupid.
>intel flatlined
They are literally fine. The only issue is they are still young in the gpu game, so don't expect anything for 10 years.
>>101721191
>RTX 9090
>still 24GB VRAM
>>101721191
No, U
>>101721921
i have no idea where to even start, any instructions anywhere? i couldn't find a single account of anyone on the internet successfully doing this
>>101721181
>5060
>8gb
They wouldn't dare
>>101722821
>22GB GDDR9 64bit bus
>>101722952
There are no instructions and no one has done it successfully. The 2080 22GB one has documented evidence at least. Without proof from the author it's probably just clickbait.
>>101721562
>amd can't into AI
dnf install -y \
    hipblas \
    rocm-hip
export HSA_OVERRIDE_GFX_VERSION=10.3.0
ollama serve
Took me half an hour on my first try. RX 6700 XT
Nvidia is worse than Apple at this point
>>101721181
>retards expected 5060 to have 10 or 12GB of RAM
delusional
>3060 will be 4 years old by the time this releases
>had a model with 12GB
moore's law isn't just dead, it's going backwards
>>101725130
>>101721181
Once the NVDA ticker goes sub $50, they'll panic and release high-VRAM Super models with 16-36GB
>>101722795
>Thank god, "AI" is stupid.
You are stupid.
>>101721944
The first round of GDDR7 should be about 80% faster than GDDR6X, so they'd rather go with GDDR7 and a smaller bus width. GDDR7 is a beast of an upgrade and benefits the intended use of the product, while the reduced memory hinders the performance of other uses for which Nvidia has enormously more expensive products.
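The trade-off described in that post can be sanity-checked with the standard formula: peak bandwidth = per-pin data rate × bus width / 8. The data rates below are illustrative assumptions (early GDDR7 around 28 Gbps, GDDR6X around 21 Gbps), not figures taken from this thread:

```python
# Peak memory bandwidth in GB/s = per-pin data rate (Gbps) * bus width (bits) / 8.
# Data rates here are illustrative assumptions, not figures from the thread.
def bandwidth_gb_s(data_rate_gbps: float, bus_bits: int) -> float:
    return data_rate_gbps * bus_bits / 8

gddr6x_192bit = bandwidth_gb_s(21.0, 192)  # 504.0 GB/s
gddr7_128bit = bandwidth_gb_s(28.0, 128)   # 448.0 GB/s
print(gddr6x_192bit, gddr7_128bit)
```

Under these assumed rates, a faster memory type mostly (but not entirely) offsets the narrower bus; capacity is of course unaffected either way.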
>>101721181
Nvidia should unironically be forced to license CUDA to the competition.
>>101721181
>128 bit bus
That seems too much, you gamers don't need more than 96
t. Jensen
>>101725309
Or the Khronos Group could make a modern OpenCL successor with support for MP shit, like Vulkan is a successor to OpenGL
>>101725352
>MP
ML*
>>101721181
Do you retards know that VRAM isn't free? If they doubled it then you would be whining that the cards are now twice as expensive. Be fucking grateful for what they are giving us and stop whining. You're acting like entitled bitches.
>>101725450
Das rite, Nvidia is just a small company trying to make a buck to stay afloat, they are only worth more than Canada and Germany, they haven't even surpassed the USA yet
>>101725450
>people fucking believe that
>>101721181
>GPU gets faster
>more vram is processed on and off the chip within a time frame with the same vram size
>BUT I NEED MORE VRAM!!
where are those 8k res games?
>>101725468
>worth more than canada and germany
what kind of retards are here
GDP Canada: 2.2 trillion
GDP Germany: 4.2 trillion
Nvidia turnover: 70 billion
if nvidia makes a turnover of 2 trillion a year, let's talk again
>>101725450
>the cards now are twice as expensive.
but they are
>>101725450
not that, but a new 2gb module (there are a few of them on your card) costs them around $3 to buy. for $50 more they could easily double the vram, but then they would cannibalize their own professional market and thus their profit
>why do that if people have no choice but to buy your crap
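Taking that post's numbers at face value (the ~$3 per 2 GB module price and the module count are its assumptions, not verified figures), the claimed bill-of-materials delta works out like this:

```python
# Rough BOM delta for doubling VRAM via extra modules, using the post's
# assumed numbers: ~$3 per 2 GB GDDR module, so four modules on an 8 GB card.
MODULE_COST_USD = 3.0  # the post's estimate, not a verified price
MODULE_GB = 2

def extra_cost_to_double(vram_gb: int) -> float:
    extra_modules = vram_gb // MODULE_GB  # doubling adds the same number again
    return extra_modules * MODULE_COST_USD

print(extra_cost_to_double(8))  # 12.0 -> roughly $12 in parts
```

Even allowing for margin, that is consistent with the post's point: a $50 retail bump would be pricing strategy and market segmentation, not component cost.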
>>101722952
It's easy, you just desolder the 16 Gb GDDR6X chips and solder on 32 Gb GDDR6X chips.
>>101721562
AMD could do AI since the 6000 series, but only on Linux.
Luckily the OS filtering means I could do AI for less money.
>>101725595
>gpu gets faster
lmao
>>101724099
Obviously it's possible, you retard, but the performance is inferior.
>>101725726
Where would you put them? The pcb is already fully populated on both sides.
>>101725783
anon your cherrypicked sample doesn't change the fact
>>101721181
What the fuck is their problem?
>>101725886
>bro just turn on framegen dood
lmao
>>101722193
>muh vidya
fuck off
>>101721562
You know that the 5060's intended market doesn't give a hoot about the AI meme, and 8GiB is more than sufficient for their use case (1080p/1440p).
Newsflash: more than 8GiB is only really needed for 4K and fancy texture packs in select titles. This will remain the case until the next PlayStation-Xbox generation. Consoles control the baseline for developers.
>>101722234
The intended thing of jewmaxxing
>>101721181
man i'm still rocking my ASUS STRIX ROG 1070 and an overclocked intel 4790k on a Z97. You fucking conzoomer retards are the reason everything is fucking shit - why bother making shit that actually fucking works AND lasts when you're all going to slurp up whatever steaming pile of horseshit is served to you
>>101721181is SLi still a thing?
>>101722786
Nvidia wants to protect their AI-focused line-up, and gaymes don't care that much about VRAM despite the memes. VRAM is only a problem at 4K and beyond with some fancy texture packs in selected titles. The only people bitching about "MAH VRAM" are poor-fag AI-types who can't afford Nvidia's intended SKUs.
>>101722234
8GB is still perfectly fine. The software side of things hasn't moved past it. Anyone whining about wanting more VRAM is likely someone who uses programs that would benefit from more VRAM, but that's it.
>>101721562
Apple silicon with its unified memory is pretty based.
>>101721181
Unironically, what is stopping Intel or AMD from offering GPUs with 48GB of VRAM for 300 bucks?
>>101722193
AI was the main reason I upgraded in 2022 from my 2015 computer. 2060 12GB. I refuse to ever buy another video card with less than 16GB VRAM, maybe even 24GB.
>>101725926
>>101725980
>8gb is enough for 1080p
You can't even afford a leather jacket lmao
>>101726052
They'd be selling at a loss and nobody would buy them because consoomers prefer the 5060 8gb since it has the Nvidia name. I mean look at Arc: Intel is offering a lot of silicon and vram for cheap, even losing money on them, and people would still rather buy a 4060 8gb over any of the Arc cards.
>>101725130
Moore's law has nothing to do with memory capacity.
>>101725450
>If they doubled it then you would be whining the cards now are twice as expensive.
Graphics cards have basically doubled in price since 2018/2019.
>>101726107
The Arcs are no cheaper; $270 for an 8GB A750 when you can get a 12GB 2060 for $300. $470 for a 16GB A770 when you can get a 16GB 4060 for $450.
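Using only the prices quoted in that post (taken as stated, not independently checked), dollars per GB of VRAM is easy to compare:

```python
# Price per GB of VRAM for the cards and prices quoted in the post above.
cards = {
    "A750 8GB": (270, 8),
    "2060 12GB": (300, 12),
    "A770 16GB": (470, 16),
    "4060 16GB": (450, 16),
}
for name, (price, gb) in cards.items():
    print(f"{name}: ${price / gb:.2f} per GB")
```

By that metric the $300 12GB 2060 is the cheapest per GB of the four, which is the post's point that Arc isn't actually undercutting on memory.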
>>101726082
>Max details
Retard, the intended market will have their stuff set to auto, which would be medium-high detail settings. Max detail is literally just lossless textures, which barely look better than medium-high textures.
>>101726052
No need for 48GB VRAM in gaming. 12GB was a must last gen because Nvidia's new gimmick demands more VRAM than before, yet Nvidia didn't give more VRAM to anyone. Then Nvidia's latest cards also had a newer gimmick that demands more VRAM, and once again Nvidia didn't increase VRAM and in fact cut the VRAM on the xx60 class card. That truly was a fuck you to gamers. I was going to say it was a fuck you to low-end gamers, but Nvidia was asking $400 for the 4060 and even higher for the 3060 due to covid. I can't wait for Nvidia to crash and burn.
>>101726107
I bought a 980ti from ebay for $100+tax because it has the same performance as the Arc 770, is $200 cheaper, and has mature drivers with better driver support (and no crashing) for older games
>>101726052
High capacity GDDR chips aren't cheap, nor are they needed for gayming use cases.
Almost all of the "MAH VRAM" noise is coming from poor-fag AI-types who can't afford Nvidia's intended SKUs for the AI meme. Almost no gaymer is complaining about the lack of VRAM capacity.
>>101726149
Imagine paying $400 for a card that can't even max games at 1080p in 2024 lmao at your life
>>101726052
VRAM isn't that useful for games. Nvidia are still being cheap jews, but you don't really need more than 16gb of VRAM for most games, even in 4k, which is what both Intel and AMD offer for a fair price. VRAM is mostly useful for stuff like AI, which is very CUDA dependent, so Nvidia has a monopoly on it and can gouge prices. If Nvidia was forced to license CUDA, suddenly GPUs would start shipping with twice or thrice as much VRAM for the same price.
>>101721562
Go outside and have sex.
>>101725596
>Turnover
Wtf that's not how you spell revenue
>>101726116
>memory capacity has nothing to do with transistor density
I get the AI chat, but 8GB is retarded even for gaming.
>>101722837
>>101726260
That is correct, yes.
>>101725980
>>101726178
This. H100s aren't even that expensive, around 25k. You can get an A100 for even less. How fucking poor do you have to be that you cannot save 25k? What is it about AI that is so attractive to poorfags and pajeets? RTX cards are toys for gaming. 8GB of VRAM is a bit too little, but 12GB would be more than enough.
>>101726292
extremely obvious samefagging, you should change your writing or punctuation style when you reply to your own posts to make it harder to spot
>in b4 inspect element screenshot reply
im on 3060 Ti what card do i get now?
>>101726327
You will notice you didn't call me a liar.
>>101726340
A used 3090 is still by far the best value you can get.
>>101726347
>he doesn't even deny that he's samefagging
I respect the honesty
>>101721873
nvidia users are unironically like those annoying applefags of the 00s who talk about how they're totally gonna do something "creative" now that they have a mac, but instead it's how they're totally gonna learn AI with their overpriced gimped gpu
>>101726191
Blame the smaller whales who are happy to pay such prices. Only ultra poor-fags are having a hissy fit over it. The days of $199 discrete SKUs are gone and never coming back
>>101726434
Setting up SD is trivial
>>101721181
>5060
I'm not a yuropoor or poorfag, I won't be buying this version
>>101725596
>doesn't understand market cap
/g/tard moment
>>101726458
genuine poorfags never bought nvidia in the first place
>>101721562
USGOV steps in and partially licenses Nvidia patents to other chip makers for 1% of what Nvidia could have negotiated if they weren't so hebrew, for national security "we gotta beat the chicoms" reasons. There is no good reason VRAM can't be removable like RAM. If the government attacked patent holders who hold back, maybe we could have something nice for a change...
>>101725130
That has nothing to do with moore's law, it's just retardation.
>>101725271
Okay zoomie
>>101726170
>980ti
It's the last GPU that supports XP natively. Best card for retro rigs. Once they're gone, they're gone.
>>101721527
Name a single game worth playing that requires 16gb+ VRAM
>>101721527
i just play old games (made before 2020). apparently this is all i need.
>>101727231
>game
NGMI
>>101726137
Yes, but the 4060 uses a gimped 128-bit memory bus where the A770 uses the standard 256-bit
>>101725450
Retard logic that doesn't understand the cost of VRAM. It's only ~$30 for 8GB of VRAM, not $300 extra. Dumbass.
>>101727231
Skyrim with sex mods
>>101721502
i only buy intel. intel processors. intel gpus
>>101725450
I bet you also buy Apple products lmao
>>101727231
I mean, I'm sure cyberpunk modded out and a bunch of other games modded out + upcoming games easily could.
BUT, GPUs aren't just for gaming, fag
Just get an AMD? Cheaper for same performance with way more VRAM. Can't do AI but who gives a shit
why don't they just make cards that let you socket in however much vram you want? worked for motherboards.
>>101727497
You can do AI slop, it's just slower at making the images
>>101727581
Because that requires them being better companies and hurts stockholders
>>101727059
They have to let investors finish the great NVDA pump and dump first.
8GB of VRAM ought to be enough for everybody
>>101726827
He's comparing GDP with revenue, which doesn't make sense, but it's still a lot less retarded than comparing GDP with market cap.
For example, my country has a company (Novo Nordisk) with a market cap higher than its GDP.
>>101727231
If you need them named then you don't even game