/g/ - Technology


Thread archived.
You cannot reply anymore.


File: OMB-Image-1-Datacenter.jpg (181 KB, 1080x720)
How many years until all these data centers need to replace their hardware and there are FUCKTON of used components being sold for peanuts?
>>
>>108523082
you mean returned to NVIDIA and destroyed.. i mean, "recycled"
>>
File: duke nukem max headroom.jpg (1.32 MB, 4000x3054)
never, they're gonna destroy it before we get to it.
>>
>>108523082
The hardware on order is already obsolete before the building is even finished. So they'll be replaced fast, they're doing like 1 gen every 12 months? But you won't get any, they're getting 'recycled'.
>>
>>108523082
>>108523370
Current models give the old H100s a lot more bang for their buck. A lot more tokens per unit of compute than when they first came out. The old H100s are more valuable today than they were when they were released.

Don't count on them being replaced for another 10 years
>>
nvidia probably buys them back just to prevent market crashes
>>
>>108523383
the cost per token on every new gpu generation drops by about 5x. it's financial suicide not to upgrade your gpus.

even if you get price-gouged by nvidia, it ends up being worth it in the long run because electricity costs dominate.

on the other hand, this faster replacement of hardware has actually drastically reduced the price of older gens. you can now rent an h100 for about $2/hr, which was unthinkable a few years ago.
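The $2/hr rental figure above translates into a cost per token like so. The throughput number is an illustrative assumption, not a benchmark; real figures depend heavily on model size, batch size, and quantization.

```python
# Napkin math: cost per million tokens on a rented H100.
RENT_PER_HOUR = 2.00      # $/hr, the figure cited in the thread
TOKENS_PER_SECOND = 1000  # ASSUMED aggregate batched throughput

tokens_per_hour = TOKENS_PER_SECOND * 3600
cost_per_million = RENT_PER_HOUR / (tokens_per_hour / 1e6)
print(f"${cost_per_million:.2f} per million tokens")
```

Under these assumptions an older rented card is already well under a dollar per million tokens, which is why cheap secondhand rentals matter more than secondhand sales.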
>>
>>108523082
Every 2 years or so, but those ai cards can't be used for anything else. What do 99% of people do with a kilowatt-scale card that has no display outputs lol.
>>
>>108524275
there are a number of enthusiasts already using old data center cards at home for local inference. I personally wouldn't want something that power hungry and loud at home, but I get it.
>>
>>108524242
the gpus don't grow on gpu trees in the factory backyards of the execs who want those GPUs
>>
>>108524242
It's actually cheaper to just leave existing hardware running than to rip it out and replace it, even if newer hardware has multiplied in performance.
That's what I take from being in a Google data center where even Pascal-era GPUs are still floating around. Even for general compute there are hordes of Skylake servers.
Of course, very few Google DCs are short on room or electricity.
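The keep-vs-replace tradeoff described above comes down to sunk capex: a card you already own only costs electricity to keep running, while a replacement must also amortize its sticker price. A rough sketch with illustrative assumed numbers (card power, electricity price, speedup, and price are all made up for the example):

```python
# Keep-vs-replace napkin math over an assumed 3-year window.
POWER_KW = 0.7   # ASSUMED draw of either card, kW
ELEC = 0.10      # ASSUMED electricity price, $/kWh
SPEEDUP = 5      # ASSUMED perf gain of the new card at equal power
NEW_CARD = 30_000  # ASSUMED replacement price, $
YEARS = 3

hours = YEARS * 365 * 24
# Cost to finish the old card's workload:
keep = POWER_KW * ELEC * hours                          # electricity only; capex is sunk
replace = NEW_CARD + POWER_KW * ELEC * hours / SPEEDUP  # capex + 1/SPEEDUP the energy

print(f"keep: ${keep:,.0f}  replace: ${replace:,.0f}")
```

Under these assumptions the electricity saved by the faster card never comes close to its purchase price, which is consistent with old Pascal cards still earning their rack space; the calculus only flips when the DC is constrained on power or floor space.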
>>
>>108523082
most if not all of the server hardware will be impractical for personal use
>>
>>108524314
yeah, for general applications i get that it makes sense to keep older gpus, especially if you're doing video encoding or rendering where the chips haven't really gotten better.
but i assume training frontier models requires thousands of the latest gpus.


