>AI datacenters to be built in 2025 will suffer $40 billion of annual depreciation, while generating somewhere between $15 and $20 billion of revenue
how does this work as a business model? does this mean in the next couple of years you can expect to buy compute clusters for absurdly low prices? also how feasible is it to try and use one in a home lab?
First to AGI wins.
>>106443535
host deepseek obviously, but they'd rather let it rot in landfill than allow anon to selfhost
>>106443535
nvidia will brick them remotely
what else would the trillion dollar companies spend their money on
they couldn't care less
>>106443535
Depreciation only means shit if you actually intend to sell the hardware. Making a business case based on depreciation would mean offloading the hardware within a year, which is impractical.
No, the worry about depreciation comes from investors wanting to pull out: every day the hardware ages is less money for them when it comes time to cut the companies up.
>>106443535
>compute clusters for absurdly low prices
They'll still take 5+ kW to run.
>>106443556
Do they really though? I don't believe that AGI could stay proprietary.
>>106443535
they're taking a bet on AI suddenly becoming useful. they don't want to miss a possible dot-com boom, so they're investing a ton in AI, just in case it becomes the next big thing. they're ok with taking a huge loss because of the slight chance of a much larger gain.
>>106443556
explain how an llm will become agi
your iq is too low to understand
depreciation is a way to book a loss to offset profits in a given year. they'll never sell these GPUs; in actuality they'll run them till the wheels fall off
>>106444402
The problem is there are no profits. Only loss.
Adding depreciation just means you lost even more on paper.
>>106444402
what will they run them on? nvidia's business model is to lock important features behind new models to keep you upgrading. why not do that to businesses?
>>106443535
well it works for Nvidia
that's why they won't slow down
>>106444432
profit? a lot of companies that buy these GPUs also own equity in NVIDIA. it's a circular structure where they can pump their bags legally :)
>>106444448
because they cooperate between themselves. if you look it up, NVIDIA offers loans to companies to buy its GPUs, so it pumps its own bags, but those companies also own shares in NVIDIA. it's circular
>>106444392
Trust the science, chud. AI will replace and kill everybody, luddite.
>>106444432
Anon, you can write depreciation off on taxes. If anything it's indirect profit.
>>106444644
If you're not profiting at all, then there is nothing to tax anyway. That's the case for 95%+ of "AI" companies.
Depreciation only becomes relevant in that situation if you're looking to sell the hardware off.
>>106444402
Depreciation is still linked to either usage or useful life. Even if it's not an immediate "true" cash expense like you'd have if you paid the janny a wage, you'll still have to replace the equipment once it's been exhausted. You can still think of depreciation as "equipment being used up that will have to be replaced in the future if we want to continue operations."
>>106444644
You can write any business expense off in taxes. Being able to write off $210k of the $1,000,000 you spent on equipment doesn't justify buying that equipment.
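To put rough numbers on that write-off point, here's a minimal sketch assuming straight-line depreciation over 5 years and a 21% corporate tax rate; both are illustrative assumptions, not figures from this thread.
[code]
# Rough sketch: why a depreciation write-off doesn't make hardware "free".
# Assumes straight-line depreciation over 5 years and a 21% corporate tax rate
# (illustrative assumptions, not figures from the thread).

capex = 1_000_000          # cash spent on the equipment up front
useful_life_years = 5
tax_rate = 0.21

annual_depreciation = capex / useful_life_years       # 200,000 per year on paper
annual_tax_shield = annual_depreciation * tax_rate     # 42,000 of tax actually avoided per year
total_tax_shield = capex * tax_rate                    # 210,000 over the asset's whole life

print(f"annual depreciation expense: ${annual_depreciation:,.0f}")
print(f"tax avoided per year:        ${annual_tax_shield:,.0f}")
print(f"tax avoided over life:       ${total_tax_shield:,.0f}  (vs ${capex:,} spent)")
print("and only if there are profits to offset in the first place")
[/code]
The write-off only ever gives back the tax on money you already spent, and only if there are profits to offset.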
Is the annual depreciation just the drop in value for GPUs? That's not really a problem. As long as you can pay back the investment, it's basic accounting.
>>106443535
I'm unironically salivating thinking of the massive AI bubble burst and millions of used h100s flooding the market. I'm setting aside used-car levels of money to grab a few and run a fully local tts, vision/language, and image gen server.
>>106443535
There are way better uses of money than slowly killing ecosystems or making areas worse for the populations who live there, just to make room for soon-to-be-obsolete data centers lmao
>>106443535Jewish trickery
>>106443535
>how does this work
do you know what a ponzi scheme is?
>>106443535
>how does this work as a business model?
it's kind of like food delivery apps and non-major streaming services where they're always losing money but investors keep piling money into it
>>106443556
agi is 1000 times more plausible than quantum computing and it's still very unlikely
The economy isn't real, it's all bullshit
>>106446978
>better uses for money
Such as what?
>>106447052
giving it directly to me
>>106443535"angel investors" aka the same VC scam that's been running since Nortel
>>106443556
Fpbp.
>>106443556
This, but what none of the techbros or their investors understand is that AGI is impossible with LLMs.
Okay, maybe if you put every single ounce of human resources into it, we could create AM with thousands of miles of winding server banks under the crust of the earth. But even then, I don't think it's likely.
>>106443535
AGI? This is just the latest and greatest spyware funded by mossad/cia slush funds. You didn't think that the central planning committee was your friend, did you?
Also, the next gen of chips designed for these purposes is going to be much cheaper and more energy efficient. You're going to see costs plummet in the next 15 years.
>>106443535
the US is on track to default on its debt, pull back social security, or hyperinflate the USD in fairly short order. As part of money creation, it's investing in new tech with the hope of creating enough productivity to "grow" out of our debt crisis. I see two main motivations for the attitude towards AI (and AGI specifically):
1) We basically have no hope without some insane growth in productivity. We tried immigration but it totally destroyed our social fabric and has made most shit and people significantly less productive. If we don't get some AGI miracle nothing else will work, so we need to put everything into this. (this is the optimistic view)
2) We are doomed, there is no saving the global economy. We should sell people on the delusion of AGI so we can accrue as much of the new money being created as possible to ourselves, so that when things come tumbling down we are in a place to leverage our finances and power and keep ourselves safe or restore order more generally. (I think this is Peter Thiel's view and there are more antics going on)
I don't think anyone seriously thinks AGI is on the table, but delusionally acting like it is is the only way to keep the markets going and siphon off a bunch of value. The people in these AI companies seem to be really fucking stupid so maybe they think they are about to reach it, but I doubt they really care. It's just a way to try to put off the debt crisis and whatever will come from that, since basically no matter what it will devastate the global economy. That's why the datacenters are like this and there's so much stupid money being thrown at it. AI in the long run will have a very significant effect, but it will be long after we get out of whatever we are about to go through. I think it's mostly just a scam currently for those people to suck in the new money being created.
>>106448840
Pretty much. Feels like the last big money siphon by governments before the inevitable collapse at this point. The next 20 years are going to be fucking ROUGH.
>>106446995
oh, like pre-revenue startups that nobody sees making money, but investors are sure they will in the future, so the money just keeps coming?
>>106443718
that's plain false. as soon as nvidia releases the gb200 successor, they're gonna scrap all of their hardware for the newest, in sheer fear of missing out against competitors.
facebook was building one of the largest datacenters in the us until the chatgpt release, when they decided to scrap the entire building $70mil into construction to plan for an HPC instead.
>>106446978
based. fagman will negotiate on a county basis with corrupt politicians so they can push the utility costs onto the citizens living there. that's also why you see a lot of hpc built in the fucking desert. 1GW of electricity consumed requires 1GW of cooling, the most effective method being literally evaporating precious water. the fact that they're allowed to build those data centers in the desert is plain wrong.
even most power plants have to be built alongside a water stream, like a river, so they don't steal people's drinking water
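For scale on the evaporative cooling claim, a back-of-the-envelope sketch using only the latent heat of vaporization of water (~2.26 MJ/kg). The 1 GW figure is the poster's, and real cooling towers reject only part of their heat evaporatively, so treat this as a rough upper bound.
[code]
# Back-of-the-envelope: how much water does dumping 1 GW of heat purely by
# evaporation consume? Uses only the latent heat of vaporization of water;
# ignores drift, blowdown, and non-evaporative heat rejection.

heat_watts = 1e9                 # 1 GW of heat to reject (figure from the post above)
latent_heat_j_per_kg = 2.26e6    # latent heat of vaporization of water

kg_per_second = heat_watts / latent_heat_j_per_kg
litres_per_hour = kg_per_second * 3600   # 1 kg of water ~ 1 litre

print(f"water evaporated: ~{kg_per_second:.0f} kg/s, ~{litres_per_hour/1000:.0f} m^3/hour")
# roughly 440 kg/s, i.e. on the order of 1,600 cubic metres per hour
[/code]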
>>106447002
We already have quantum computing; it just has specific use cases. Using quantum computing for regular calculations is not only a waste, it's actively bad at that kind of workload.
so if depreciation isn't really a factor since they won't sell, what's the best value setup for a homelab HPC then? i currently only have the E810 nic, S5212F switch, and some surplus SAS drives on my list of stuff to get for dirt cheap while saving a lot of cash, but everything else, be it the board, rack unit, and compute modules, is all up in the air atm. or should i just build a minipc cluster?
i wanna set something up as a renderfarm for cgi stuff, mainly vfx in houdini, so anything that'll boost that will be ideal.
>>106448840
>the US is on track to default on its debt, pull back social security, or hyperinflate the USD in fairly short order.
Ah yes, something that retards like yourself have parroted for literally decades now.
at some point these things are just gonna be 90% die, 10% board.
>>106450947
there's no good performance/value setup possible right now. if you wanted to run any good-sized model (in the ~100 GB VRAM range) you'd have to get your hands on amd's infinity fabric or nvidia's nvlink, and that's already really difficult to pull off for a regular user.
but given your use case, it's not really an hpc you need; it'll mostly depend on the rendering software you use. some software supports cuda (vray, blender, twinmotion, maya, c4d, etc.), but there's very little support for rocm. your easiest choice is to get a good cpu (you can get kinda cheap second-hand epyc/xeons right now). honestly, if your software supports cuda, then adding a good gpu to your computer will likely already be a very good speedup over doing it on the cpu, and it'll also save the headache of installing the rendering software on a server and moving files between computers
>>106450999
the nvl72 is advertised as 'linear scaling', meaning the whole cluster can act as a single GPU. the only limiting factor is PHYSICAL latency driven by cable length between the gb200 dies. that's why that stuff's gotta be water cooled, also 200 kW+ TDP, predicted to increase on the nvl72's successor
>>106451047
that's what i figured. i was thinking maybe a threadripper setup might be more cost effective than an epyc processor, since i don't really need enterprise-grade reliability, just a lot of compute power and PCIe lanes for GPUs and storage, but it seems like motherboards for them are absurdly expensive. the tradeoffs seem to be unavoidable. anything that's scalable while still PCIe gen 5 is killing me from a pricepoint perspective. i would have expected there to be way more options.
>>106451085
to add to that, i was looking into gigabyte's Altra arm dual-socket boards, but idk shit about how reliable they are
>>106451085
you can use PCIe gen 4 with a gen 5 gpu. it'll be slower for transferring stuff, but if your workload revolves more around having the gpu do the work then it's not really a bottleneck.
also, threadrippers are bad in terms of price/performance; you're better off getting a 16-core ryzen at a reasonable tdp/frequency. that said, if your software has cuda support, then adding a $500 gpu and actually using cuda is a MASSIVE speedup.
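Rough numbers behind the "gen 4 isn't really a bottleneck" point. The x16 bandwidth figures are approximate theoretical peaks and the 10 GB scene size is a made-up example, so this is only a sketch.
[code]
# Back-of-the-envelope: why PCIe gen4 vs gen5 rarely matters for GPU rendering.
# Bandwidth figures are approximate theoretical x16 numbers; real transfers are lower.

pcie4_x16_gb_per_s = 32   # ~GB/s, PCIe 4.0 x16 (approx)
pcie5_x16_gb_per_s = 64   # ~GB/s, PCIe 5.0 x16 (approx)

scene_size_gb = 10        # hypothetical scene/texture upload to the GPU

t4 = scene_size_gb / pcie4_x16_gb_per_s
t5 = scene_size_gb / pcie5_x16_gb_per_s

print(f"upload over gen4: ~{t4:.2f} s")
print(f"upload over gen5: ~{t5:.2f} s")
# Either way the transfer is a fraction of a second, while the render itself
# runs for minutes on the GPU, so the link speed isn't the bottleneck.
[/code]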
>>106451121
yeah, it's more about gpu density. ideally i'd like to build something that can hold up to 8 rtx cards, but the sheer power draw would probably fuck up any normal board and psu setup if i'm not cautious.
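A quick power-budget sanity check on the 8-card idea. The per-card and platform wattages are assumptions (roughly what high-end RTX cards draw under render load), not measured figures.
[code]
# Quick power-budget check for an 8-GPU box. The per-card number is an
# assumption (roughly what a high-end RTX card draws under load); adjust to taste.

cards = 8
watts_per_card = 350          # assumed per-GPU draw under render load
platform_watts = 400          # assumed CPU, board, drives, fans

total_watts = cards * watts_per_card + platform_watts
print(f"estimated load: {total_watts} W")            # ~3200 W

# A 15 A / 120 V household circuit tops out around 1800 W, and consumer ATX
# PSUs around 1600 W, so this needs multiple PSUs/circuits or 240 V service.
circuit_limit_watts = 15 * 120
print(f"single 120 V circuit limit: {circuit_limit_watts} W")
[/code]
Which is why dense multi-GPU render boxes end up on server PSUs, multiple circuits, or 240 V.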
>>106451136
if you want to do that, then no version of pcie will help. the cards need to communicate, and pcie bandwidth is only good enough for up to 2 cards; beyond that your only option (and the better one) is to use nvlink
>>106451147
damn, isn't nvlink absurdly expensive to utilize though? as in, most of the cards that support it are enterprise grade and in the 5-figure range? i'd rather avoid that, but it makes sense if they have to avoid large memory lookup latencies
>>106444509
The science is what says it just can't, lol.
>>106446959
If this actually does happen, they wouldn't even be that expensive.
>>106443535
The bulk data collection means it's all worth it. They spy on everyone and keep a complete record of everything a person has ever done in their life.
>>106450955
That's a fair criticism, but birth rates and the impact of immigration have massively shifted over the past 2 decades. As I said, Peter Thiel and those types talk about this openly and have fallback stuff set up for when it goes down. Regardless of whether it happens, it's still what's motivating the investment, as I said.
>>106451166
it's expensive, but realistically most software won't support it out of the box. even commercial ai inference software tends not to use it effectively
>>106447002
>agi is 1000 times more plausible than quantum computing and it's still very unlikely
Quantum computing is already a reality, but we have never even seen an inkling of actual AI
>>106447002
>sink more debt into wasteful ventures
>USD collapses
>US gov falls, states balkanize
>???
>Network States cyberpunk techbillionaire feudal dystopia
>>106443535
>20 to 25 billion in accounting losses carried forward to offset against any taxes forever
>>106453426
These companies aren't paying taxes tho. They are basically all operating at a loss with VC cash propping them up.
Depreciation is just more loss in this situation.
>>106443535
>does this mean in the next couple of years you can expect to buy compute clusters for absurdly low prices? also how feasible is it to try and use one in a home lab?
Nah. Look at the depreciation so far, and how even shit-tier compute cards from 5+ years ago have held their value even as blackwell launched. Scarcity and energy concerns are the biggies.
These companies can offload and liquidate at high prices because the market still demands it. Most people will settle for an ada generation card because they're still sucking off hobos in alleyways for old volta cards.
The other week I'd seen that even humble turing cards from 2018/19 still held value, because there's always someone down the pecking order willing to buy a card for 11 to 16GB of VRAM. Having a card with 40 or 80GB just sets you an entire tier above. The beggars trying to get compute out of 3x A770 aren't equivalent to the person with 1x 48GB card. And the one with the 48GB card and the capacity to house 3 of them? He has 144GB of VRAM in his pool to work with.
So long as there are hard limitations on lanes, DDR dimms and speeds, and those stay at the market value they are, old cards stay where they are.
The issue is for the datacenters, the big AI companies spending billions, trillions on every single gpu they can get.
Seriously, want to see how bad it is? Look at ampere, look at volta, look at how the 3090 cannot and does not fall under that price. It's all balancing on whatever the current lowest price for a large unit of memory is; it's skewed and fucked.
>>106446978
It's a fiat currency you fucking imbecile. It's not real. 1 second after I make this post the collective hope could shatter and everything is worthless. There's no logic to it, so there are no "better uses of money". There's a better allocation of resources to be argued, yes, but not money.
>>106453511
>because the market still demands it
What's going to happen is that when a few AI startups fail, a lot of them will go. Some bigger players may snatch up GPUs, but considering that over 90% of AI companies are basically unsuccessful, there will be a flood and prices will crater far beyond any depreciation worries
>>106453426
It's not forever. It can be edged into 2 years + some marginal accounting extensions.
>>106443535
It does not.
The entire "business" is all about smoke-and-mirrors tricks that only work on out-of-touch "investors" and glue-eating retards like >>106443556. The Saltmans and Musks just care about selling product, not whether it's good. "AGI", "ASI", "Superintelligence" etc. are just meme words; it's no different from a fake dietician trying to sell you "superfood" with "negative calories".
>>106453537
Yeah, a big collapse would do it too. So long as the bubble is there and there's demand restricted by supply, we'll see it continue as is.
It depends what we define as the AI industry failing. If the smaller startups fail, some cards will make it onto the market for the public; some will find their way into other labs, businesses, etc. If a massive collapse happens, say OpenAI, meta, X, anthropic etc. all pull out, then maybe that massive liquidation could happen, but that relies on too many variables and factors, down to the way investors see these companies and their profitability.
There are other things tied to it, like GDP & business productivity. While altman preached an AI revolution setting workers free, a lot of the bullshit businesses that already extract wealth may cut back workers as agents and tools increase in capability. Automation is the big promise that's doing a lot of the heavy lifting in keeping it afloat. Whether cards will arrive in everyday people's hands? That depends on how those failures look and on the current market value of memory, which is what sets the minimum prices.
Some dream of scoring a bunch of nice 48GB or 40GB cards for a couple hunno, but it's going to depend on how much finance can suck up of these cards too, who the middlemen are, and the effort required in redistributing them.
I pray for a big AI collapse so I can slam in 3x A100 for 240GB VRAM. At 200 a pop, or A770-tier pricing, in current market terms that equates to about what it costs for a single 3090, xtx or B60 pro. The price of something like a P40 is about half that, but it's balanced off of power, memory capacity versus memory speed & compute. It's insane it's retained so much value, but it's balanced off the bottom end of the market.
The whole thing needs to go kaput with finance for them to make their way into our circles imo.
>>106443535
>does this mean in the next couple of years you can expect to buy compute clusters for absurdly low prices?
no lol, because nvidia has buyback agreements with compute providers to make sure that Extremely Dangerous Hardware does not fall into the hands of Malicious Actors (people who would actually use it for useful shit)
>>106450870
Google's total water usage is less than 1% of what's used for alfalfa, which is a crop grown purely to feed cows.
>muh datacenters need too much water
is retarded
>>106443535
Basically, once they get skynet online, profits won't matter.
>>106443535
They are getting trillions in capital, so making money doesn't matter
>>106444356
It is pump-and-dumping with funny money. The economy is so disconnected that it is practically a parody at this point. Nobody has the balls to call the bluff and let it all correct (it would be a bloodbath)
>>106453859
>Brainlet can't get into ecosystems and has no idea how vital fresh water is, especially in an arid biome
>>106443556
It is the technophile version of the rapture.