/g/ - Technology


Thread archived.
You cannot reply anymore.




File: Nvidia.jpg (58 KB, 650x848)
58 KB
58 KB JPG
OH OH OH NO NO NO HAHAHHA
>>
File: 9090.jpg (77 KB, 586x421)
77 KB
77 KB JPG
Get with the times, gramps.
>>
>>101721181
3060 has 12gbs
ahjhahahaa
nvjew
>>
>>101721181
AI FAGS BTFO!
>>
>>101721181
>buying nvidia
>in 2024
>>
>>101721181
If your card doesn't have 16GB in 2024, it's useless for real world use. All you get is goyslop at <16GB. Even 16GB isn't enough in the coming future, as you would want >200GB VRAM
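A back-of-the-envelope sketch of where numbers like ">200GB VRAM" come from for AI use. The model size, precisions, and overhead factor below are illustrative assumptions, not measurements:

```python
# Rough VRAM needed just to hold an LLM's weights: bytes per parameter
# depends on precision (fp16 = 2 bytes, int8 = 1, int4 = 0.5), plus an
# overhead factor for KV cache and activations (the 1.2 here is a guess).

def vram_gb(params_billions, bytes_per_param, overhead=1.2):
    return params_billions * bytes_per_param * overhead

for name, bpp in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"70B model at {name}: ~{vram_gb(70, bpp):.0f} GB")
```

Even quantized to 4 bits, a 70B model blows past any 8GB card, and the fp16 figure is already in >160GB territory before longer context windows make it worse.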
>>
>nvidia refuses to add more vram
>amd can't into AI
>intel flatlined
now what?
>>
>>101721562
Manually add more vram via soldering and shady chinesium drivers
>>
>>101721562
CHINA SAVE US
>>
>>101721527
mf I just play rpcs3 games and dmc5 on my pc
>>
>>101721585
i wish this was possible with the 3090, i have two of them
>>
>>101721562
>amd can't into AI
Good. That way faggots like you will ignore them and I can safely buy good GPUs for decent price to play my vidya. And you can pay extra for 8gb card in 2025 and whine how every other card is shit because they don't support your completely irrelevant money bubble.
>>
>>101721562
>>101721594
>>101721585
As joke-y as it sounds, our options are pretty much Chinesium hacks, China managing to clone CUDA, or software hacks to run CUDA on AMD like zluda or scale.
Certainly a sad state of affairs.
>>
>>101721585
Cyberpunk seal of approval
>>
>>101721837
You can
>>
I get why a 5060 should be cut down, but then why give it GDDR7?
>>
>>101721181
>that bus
>that VRAM
goyim will still buy droves of them because they get prebuilds
>>
>>101721562
>muh AI
who cares bro
>>
>>101721181
>$2000 starting price
>>
>>101721181
Enshittia (TM)
>>
>>101721527
hoping to snag a 7900xtx deal. for now, I will rock my 6950
>>
>>101721181
My old GTX 1070 from 8 years ago is 8 GB VRAM, what are they doing?
>>
same memory size as my 8yo rx570
>>
>>101722193
servers store your prooompts. it is what it is, but I prefer some privacy.
>>
>>101722234
They jewmax Cuda core count and now they're doing the same with Vram.
It's a double jew sword.
>>
>>101721585
They'll lock this down with firmware btw. Already did this with the 3090 (not sure about 4000 series)
>>
xx60 and xx70 will forever be clueless retard poorfag tier
>>
>>101721181
GPUs at this point should have a base of 16gb of RAM and higher-end models should be nearing 32gb. This is just lazy moneygrabbing.
>>
>>101721562
>can't ai
Thank god, "AI" is stupid.
>intel flatlined
They are literally fine. Only issue is they are still young in the gpu game so don't expect anything for 10 years.
>>
>>101721191
>RTX 9090
>still 24GB VRAM
>>
File: nvidia.jpg (509 KB, 1080x1002)
509 KB
509 KB JPG
>>101721191
No, U
>>
>>101721921
i have no idea where to even start, any instructions anywhere? i couldn't find a single account of anyone on the internet successfully doing this
>>
>>101721181
>5060
>8gb
They wouldn't dare
>>
>>101722821
>22GB GDDR9 64bit bus
>>
>>101722952
There's no instructions and no one has done it successfully. The 2080 22GB one has documented evidence at least.
Without proof from the author it's probably just clickbait.
>>
>>101721562
>amd can't into AI
dnf install -y \
hipblas \
rocm-hip

export HSA_OVERRIDE_GFX_VERSION=10.3.0

ollama serve


Took me half an hour on my first try

t. RX 6700 XT
>>
Nvidia is worse than Apple at this point
>>
>>101721181
>retards expected 5060 to have 10 or 12GB of RAM
delusional
>>
>3060 will be 4 years old by the time this releases
>had a model with 12GB
moore's law isn't just dead, it's going backwards
>>
File: 1642880883670.jpg (85 KB, 400x400)
85 KB
85 KB JPG
>>101725130
>>
>>101721181
Once the NVDA ticker goes sub $50, they'll panic and release high vram super models with 16-36GB of vram
>>
>>101722795
>Thank god, "AI" is stupid.
You are stupid.
>>
>>101721944
First round of gddr7 should be about 80% faster than gddr6x, so they’d rather go with gddr7 and smaller bus width. Gddr7 is a beast of an upgrade and benefits the intended use of the product while the reduced memory hinders the performance of other uses for which nvidia has enormously more expensive products.
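The tradeoff that post describes can be sketched with the standard bandwidth formula; the per-pin data rates below are illustrative assumptions for the comparison, not confirmed Nvidia specs:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin Gbps.

def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

# Assumed per-pin rates for illustration: GDDR6X ~21 Gbps, GDDR7 ~32 Gbps.
print(bandwidth_gbs(192, 21))  # 192-bit GDDR6X -> 504.0 GB/s
print(bandwidth_gbs(128, 32))  # 128-bit GDDR7  -> 512.0 GB/s
```

So a narrower GDDR7 bus can match or beat a wider GDDR6X one on bandwidth, while capacity still shrinks with the bus width, which is exactly the segmentation being complained about.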
>>
>>101721181
Nvidia should unironically be forced to license cuda to the competition.
>>
>>101721181
>128 bit bus
That seems too much, you gamers don't need more than 96
t. Jensen
>>
>>101725309
Or Khronos group could make a modern OpenCL successor with support for MP shit, like Vulkan is a successor to OpenGL
>>
>>101725352
>MP
ML*
>>
>>101721181
You retards do know that VRAM isn't free, right? If they doubled it then you would be whining the cards now are twice as expensive. Be fucking grateful for what they are giving us and stop whining. You're acting like entitled bitches.
>>
>>101725450
Das rite, Nvidia is just a small company trying to make a buck to stay afloat, they are only worth more than Canada and Germany, they haven't even surpassed the USA yet
>>
File: qmaDx1IKPg9J-_port.jpg (69 KB, 512x512)
69 KB
69 KB JPG
>>101725450
>people fucking believe that
>>
>>101721181
>GPU get faster
>more vram is processed on and off the chip within a time frame with the same vram size
>BUT I NEED MORE VRAM!!
where are those 8k res games?
>>
>>101725468
>worth more than canada and germany
what kind of retards are here
GDP canada 2.2 trillion
GDP germany 4.2 trillion
Turnover Nvidia 70 billion

if nvidia makes a turnover of 2 trillion a year, let's talk again
>>
>>101725450
>the cards now are twice as expensive.
but they are
>>
>>101725450
not that, but a new 2gb module (there are a few of them on your card) that is installed costs them around 3$ to buy. for 50$ more they could easily double the vram, but then they would attack their own professional market and thus their profit :>
why do that if people have no choice but to buy your crap
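Taking that post's own numbers at face value, the bill-of-materials math looks like this (the $3-per-module figure is the post's claim, not a verified price):

```python
# The post's claim: a 2 GB GDDR module costs Nvidia around $3 to buy.

module_gb, module_cost = 2, 3      # the post's figures, not a verified BOM
extra_modules = 8 // module_gb     # modules needed to add another 8 GB
bom_delta = extra_modules * module_cost
print(bom_delta)                   # 12 -> ~$12 in parts for +8 GB
```

On those assumptions the parts cost of doubling an 8GB card is far below the $50 upcharge the post suggests, which is why the argument is about market segmentation rather than cost.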
>>
>>101722952
It's easy, you just desolder the 16 Gb GDDR6X chips and solder on 32 Gb GDDR6X chips.
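For what it's worth, the Gb-vs-GB arithmetic behind that joke, since GDDR chip capacities are quoted in gigabits; the 8-chip, 256-bit layout is an illustrative assumption:

```python
# GDDR chip capacities are quoted in gigabits (Gb), not gigabytes (GB).
# A 256-bit bus with x32 chips means 8 memory packages (illustrative).

chips = 256 // 32                 # 8 chips on a 256-bit bus
old_gb = chips * 16 / 8           # 8 chips of 16 Gb -> 16.0 GB total
new_gb = chips * 32 / 8           # 8 chips of 32 Gb -> 32.0 GB total
print(old_gb, new_gb)             # 16.0 32.0
```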
>>
>>101721562
AMD has been able to do AI since the 6000 series, but only on Linux.
Luckily the OS filtering means I could do AI for less money.
>>
File: 3060btfo4060.png (263 KB, 290x398)
263 KB
263 KB PNG
>>101725595
>gpu gets faster
lmao
>>
>>101724099
Obviously it's possible, you retard, but the performance is inferior.
>>
>>101725726
Where would you put them? The pcb is already fully populated on both sides.
>>
>>101725783
anon, your cherrypicked sample doesn't change the fact
>>
>>101721181
What the fuck is their problem?
>>
File: 3060btfo4060framegen.png (260 KB, 304x388)
260 KB
260 KB PNG
>>101725886
>bro just turn on framegen dood
lmao
>>
>>101722193
>muh vidya
fuck off
>>
>>101721562
You know that the 5060's intended market doesn't give a hoot about the AI meme, and 8GiB is more than sufficient for their use case (1080p/1440p).
Newsflash, more than 8GiB is only really needed for 4K and fancy texture packs in select titles. This will remain the case until the next PlayStation-Xbox. Consoles control the baseline for developers.
>>
>>101722234
The intended thing of jewmaxxing
>>
File: IMG_2222.jpg (54 KB, 750x1000)
54 KB
54 KB JPG
>>101721181
man i’m still rocking my ASUS STRIX ROG 1070 and an overclocked intel 4790k on a Z97. You fucking conzoomer retards are the reason everything is fucking shit - why bother making shit that actually fucking works AND lasts when you’re all going to slurp up whatever steaming pile of horseshit is served to you
>>
>>101721181
is SLi still a thing?
>>
>>101722786
Nvidia wants to protect their AI focused line-up and gaymes don't care that much for VRAM despite the memes. VRAM is only a problem at 4K and beyond with some fancy texture packs in select titles.
The only people bitching about "MAH VRAM" are poor-fag AI-types who can't afford Nvidia's intended SKUs.
>>
>>101722234
8GB is still perfectly fine. The software side of things hasn't moved past it. Anyone whining about wanting more VRAM is likely someone who uses programs that would benefit from more VRAM, but that's it.
>>
>>101721562
Apple silicon with its unified memory is pretty based.
>>
>>101721181
Unironically, what is stopping Intel or AMD to offer GPUs with 48GB of VRAM for 300 bucks?
>>
>>101722193
AI was the main reason I upgraded in 2022 from my 2015 computer. 2060 12GB. I refuse to ever by another video card with less than 16GB VRAM, maybe even 24GB.
>>
File: 3060btfo40602.png (402 KB, 1909x562)
402 KB
402 KB PNG
>>101725926
>>101725980
>8gb is enough for 1080p
You can't even afford a leather jacket lmao
>>
>>101726052
They'd be selling at a loss and nobody would buy them because consoomers prefer the 5060 8gb since it has the Nvidia name. I mean, look at Arc: Intel is offering a lot of silicon and vram for cheap, even losing money on them, and people would still rather buy a 4060 8gb over any of the Arc cards.
>>
>>101725130
Moore's law has nothing to do with memory capacity.
>>
>>101725450
>If they doubled it then you would be whining the cards now are twice as expensive.
Graphics cards have basically doubled in price since 2018/2019.
>>
>>101726107
The Arcs are no cheaper: $270 for an 8GB A750 when you can get a 12GB 2060 for $300, $470 for a 16GB A770 when you can get a 16GB 4060 for $450.
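Using just the prices and capacities quoted in that post (not current market rates), the dollars-per-GB comparison works out to:

```python
# Dollars per GB of VRAM, using the prices and capacities from the post.
cards = {
    "A750 8GB": (270, 8),
    "2060 12GB": (300, 12),
    "A770 16GB": (470, 16),
    "4060 16GB": (450, 16),
}
for name, (price, gb) in cards.items():
    print(f"{name}: ${price / gb:.2f}/GB")
```

By that metric the 12GB 2060 is the cheapest per GB of the four, which is the anon's point about Arc not actually undercutting anything.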
>>
>>101726082
>Max details
Retard, the intended market will have their stuff set to auto which would be at medium-high detail settings. Max detail is literately just lossless textures which barely look better than medium-high textures.
>>
>>101726052
No need for 48GB VRAM in gaming. 12GB was a must last gen because Nvidia's new gimmick demands more VRAM, yet Nvidia didn't give anyone more VRAM. Then Nvidia's latest cards had an even newer gimmick that demands even more VRAM, and once again Nvidia didn't increase it and in fact cut the VRAM on the xx60 class card. That truly was a fuck you to gamers. I was going to say a fuck you to low-end gamers, but Nvidia was asking $400 for the 4060 and even more for the 3060 due to covid. I can't wait for Nvidia to crash and burn.
>>
>>101726107
I bought a 980ti from ebay for $100+tax because it has the same performance as the arc770, $200 cheaper, and mature drivers with better driver support (and not crashing) for older games
>>
>>101726052
High capacity GDDR chips aren't cheap nor are needed for gayming use cases.
Almost all of the "MAH VRAM" noise is coming from poor-fag AI-types who can't afford Nvidia's intended SKUs for the AI meme. Almost no gamyer is complaining about the lack of VRAM capacity.
>>
>>101726149
Imagine paying $400 for a card that can't even max games in 1080p in 2024 lmao at your life
>>
>>101726052
VRAM isn't that useful for games. Nvidia are still being cheap jews, but you don't really need more than 16gb of VRAM for most games, even in 4k, which is what both Intel and AMD offer for a fair price. VRAM is mostly useful for stuff like AI, which is very cuda dependent, so Nvidia has a monopoly of it and can gauge prices. If Nvidia was forced to license cuda, suddenly GPUs would start shipping with twice or thrice as much VRAM for the same price.
>>
>>101721562
Go outside and have sex.
>>
>>101725596
>Turnover
Wtf that's not how you spell revenue
>>
>>101726116
>memory capacity has nothing to do with transistor density
>>
I get the AI chat, but 8GB is retarded even for gaming.
>>
File: novideo truck.png (2.09 MB, 1024x1024)
2.09 MB
2.09 MB PNG
>>101722837
>>
>>101726260
That is correct, yes.
>>
>>101725980
>>101726178
This. H100s aren't even that expensive, around 25k. You can get an A100 for even less. How fucking poor do you have to be that you can't save 25k? What is it about AI that is so attractive to poorfags and pajeets? RTX cards are toys for gaming. 8GB of VRAM is a bit too little, but 12GB would be more than enough.
>>
>>101726292
extremely obvious samefagging, you should change your writing or punctuation style when you reply to your own posts to make it harder to spot
>in b4 inspect element screenshot reply
>>
im on 3060 Ti what card do i get now?
>>
>>101726327
You will notice you didn't call me a liar.
>>
>>101726340
An used 3090 is still by far the best value you can get.
>>
>>101726347
>he doesn't even deny that he's samefagging
I respect the honesty
>>
>>101721873
nvidia users are unironically like those annoying applefags of the 00s who talked about how they were totally gonna do something "creative" now that they had a mac, except now it's how they're totally gonna learn AI with their overpriced gimped gpu
>>
>>101726191
Blame the smaller whales who are happy to pay such prices. Only ultra poor-fags are having a hissy fit over it.
The days of $199 discrete SKUs are gone and never coming back
>>
>>101726434
Setting up SD is trivial
>>
File: 1722200079881213.png (185 KB, 399x394)
185 KB
185 KB PNG
>>101721181
>5060
I'm not a yuropoor or poorfag, I won't be buying this version
>>
>>101725596
>doesn't understand market cap
/g/tard moment
>>
>>101726458
genuine poorfags never bought nvidia in the first place
>>
File: 1707787119694932.jpg (279 KB, 1333x1864)
279 KB
279 KB JPG
>>101721562
USGOV steps in and partially licenses Nvidia patents to other chip makers for 1% of what Nvidia could have negotiated if they weren't so hebrew, for national security "we gotta beat the chicoms" reasons. There is no good reason VRAM can't be removable like RAM. If government attacked patent holders who hold back, maybe we could have something nice for a change...
>>
>>101725130
That has nothing to do with moore's law, it's just retardation.
>>
>>101725271
Okay zoomie
>>
>>101726170
>980ti
It's the last GPU that supports XP natively. Best card for retro rigs. Once they're gone, they're gone.
>>
>>101721527
Name a single game worth playing that requires 16gb+ VRAM
>>
>>101721527
i just play old games ( made before 2020 ). apparently this is all i need.
>>
>>101727231
>game
NGMI
>>
>>101726137
Yes but the 4060 uses a gimped 128 bit memory bus where the a770 uses the standard 256 bit
>>
>>101725450
Retard logic that doesn't understand the cost of VRAM. It's only ~$30 for 8GB of VRAM, not $300 extra. Dumbass.
>>
>>101727231
Skyrim with sex mods
>>
>>101721502
i only buy intel. intel processors. intel gpus
>>
>>101725450
I bet you also buy Apple products lmao
>>
>>101727231
I mean, I'm sure cyberpunk modded out and a bunch of other games modded out + upcoming games easily could.

BUT, gpu's aren't just for gaming fag
>>
Just get an AMD? Cheaper for same performance with way more VRAM. Can't do AI but who gives a shit
>>
why don't they just make cards that let you socket in how much vram you want?
worked for motherboards.
>>
>>101727497
You can do AI slop, it's just slower at making the images
>>
>>101727581
Because that requires them being better companies and hurts stock holders
>>
>>101727059
They have to let investors finish the great NVDA pump and dump first.
>>
8GB of VRAM ought to be enough for everybody
>>
>>101726827
He's comparing GDP with revenue, which doesn't make sense, but it's still a lot less retarded than comparing GDP with market cap.
For example, my country has a company (Novo Nordisk) with a market cap higher than the GDP.
>>
>>101727231
If you need them named then you don't even game




