/g/ - Technology


File: 1719807082913183.jpg (132 KB, 1244x895)
nothing will stop them
>>
>>103231636
>will
>>
Billions? Why do you say that like it's a big number? Say the actual number. Even AMD is selling $5 billion of MI300X this year. Nvidia will make hundreds of billions.
>>
>>103231636
we'll see about that
>>
>>103231636
> ATI still seething
>>
>>103231636
Can they fix their retarded drivers so I can use UE5?
>>
>>103231852
AMD must be piling up CDNA3 bins. Why aren't they making enthusiast AI cards out of them to increase ROCm adoption?
>>
>>103232107
Anybody with that money is just going to buy Nvidia. AMD is a perpetual second-place company.
>>
>>103233245
Nvidia artificially limits VRAM to sell more GPUs. If AMD is willing to offer 32GB to 48GB on consumer cards and 64GB to 128GB on workstation cards, people will readily take the leap to ROCm for the massive VRAM bump.
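A quick sketch of why that leap is low-friction: ROCm builds of PyTorch expose the same torch.cuda API as the CUDA builds, so a VRAM check runs unchanged on either vendor's card. This is only a minimal illustration under that assumption; the device index 0 and the helper name report_gpu_memory are made up for the example, not taken from the thread.

import torch  # PyTorch built against either CUDA or ROCm

def report_gpu_memory(device_index: int = 0) -> None:
    # Print the active backend (CUDA or ROCm/HIP) and total VRAM for one GPU.
    if not torch.cuda.is_available():
        print("No supported GPU backend found.")
        return
    # torch.version.hip is a version string on ROCm builds and None (or absent) on CUDA builds.
    backend = "ROCm/HIP" if getattr(torch.version, "hip", None) else "CUDA"
    props = torch.cuda.get_device_properties(device_index)
    total_gib = props.total_memory / (1024 ** 3)
    print(f"{props.name} ({backend}): {total_gib:.1f} GiB VRAM")

report_gpu_memory()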


