/v/ - Video Games

File: Harrison Ford.jpg (627 KB, 1079x1429)
say Thank you
>>
first!
>>
>>735823692
if AI is so smart why don't we just ask it to generate everything anyone would ever ask at once and be done with it
>>
>RAM was $80
>Then hiked up to $300
>Now drops to $200

OMG we are so saved!
>>
>>735823692
They won't lose jack. They didn't increase production at all precisely because of this sort of volatility. The only ones fucked might be scalpers.
>>
>>735823798
Why don't they ask it to create AGI for us?
>>
File: F.jpg (23 KB, 400x399)
>>735823798
Why don't we ask it to just use the eagles instead?
>>
>>735823692
>twitter screencap thread
No, I think I'll say fuck you.
>>
>>735823798
Your ancestors asked the same but with niggers
>>
stadia
>>
>>735823692
Like most things tech nowadays, i want to see it working..
>>
>>735823965
>>twitter
it's called X unc
>>
>>735823692
Oh no the fake valuation of the company using fake money has dropped. Quick tax the goyim.
>>
>>735823692
>PC gaming is ov-ACKACKACKACK!!!
>>
>>735823839
A top end PC is still cheaper than a good car or refrigerator or guitar or any adult hobby really
Stop complaining or buckle up and flip burgers for 2 months
>>
>>735824049
Aieeeee
>>
>>735823692
Doesn't this just mean that it's more scalable? like things are just more powerful
>>
Google-sama... I pledge my loyalty to you..
>>
belch
>>
>>735824162
>Just buy it and stop kvetching, goy.
>>
>>735823845
>The only ones fucked might be scalpers
God please, let this be real!
>>
>>735823692
thank you for redeem pirat x post sir very good post bob
>>
>>735824049
NO!!¡!!!
>>
>>735823798
because once satisfied, we would just start asking for the next thing. how long have you been a human for?
>>
File: 1766014620937783.png (77 KB, 630x373)
dont care
>>
>>735824449
What's the next thing after everything?
>>
>>735824529
there is no such thing as everything - but if there was, suicide
>>
>>735824529
Everything+Everything
You don't understand how neurotic people can really be.
>>
>>735823692
i already have RAM, i need storage. those prices aren't going down are they?
>>
>>735824235
Don't tell them that
>>
File: 1770024993589884.png (349 KB, 3168x3080)
>>735823692
Thank you bosssir Sundar Pichai.
>>
you guys know ram prices are NEVER coming back down, right?
>>
>>735825134
they did in the past
>>
>>735823950
Because they would succumb to the influence of ai too easily master frodo
>>
>>735825348
That was before they realised they can just fleece the cattle and have a herd of shills respond to any criticism with "what are you, poor?"
>>
>>735823692
Memory stocks have already rebounded so this is obsolete
>>
>>735824698
It's all going to be dirt cheap in a few years. Even HBM GPUs
>>
>>735823692
Phew good thing I didn't invest in those garbage companies lmao
>Thanking Google
Good goy
>>
>>735823839
Trvke
/Thread
Ram is still overpriced slop
>>
>>735823692
>>735824153
>>735824985
>>735825348
prices cannot come down yet because demand remains unchanged

also see:
https://en.wikipedia.org/wiki/Jevons_paradox
https://en.wikipedia.org/wiki/Wirth%27s_law
https://en.wikipedia.org/wiki/Stargate_LLC
https://time.com/6961317/ai-artificial-intelligence-us-military-spending/
https://www.reuters.com/technology/pentagon-adopt-palantir-ai-as-core-us-military-system-memo-says-2026-03-20/
>>
>>735823692
Not in the way retards are saying
>>735823989
>>
Genuinely asking, what was the net positive from AI centers being normalized? Customers are fucked, the end-user market is fucked, and consoles are affected for the foreseeable future (because they can keep the price tag up and let it pass over time), so they're virtually fucked too.

What genuinely useful thing has the pump and dump scheme into AI done, that isn't doable by an earlier local assist tool?

And no porn as the normal mode.
>>
>>735823845
Yea, in the last memory shortage they got left with tons of overstock and had to sell it at a loss. So pretty sure they learned from that
>>
>>735823692
I've been waiting for this so long. About time lazy google.
>>
>>735825847
AI has plenty of uses beyond LLMs and art/video generation
>>
>>735823692
????
I'm so confused, how will this tech work.
Motherboard update, from the software itself?
>>
File: ErL4Nb9VQAAPHJm.jpg (38 KB, 600x600)
Memes becoming real
>>
Aww it was fun seeing the poorfags sweat and whine
>>
people celebrating don't understand that this just means they'll be able to make Even Bigger models which will show a marginal improvement, and thus more money will flow in because "AI Works Now", so the hardware market will get worse after this short pause
>>
>>735825950
ARE YOU STUPID LIKE EVERY OTHER NPC IN THIS THREAD?
THEY PUT AN ALGORITHM IN
IT'D BE THE SAME AS ASKING THE AI A SPECIFICALLY WORDED QUESTION ALTHOUGH >THAT HASN'T BEEN REALIZED
>>
>>735824529
infinite+1
>>
>>735823692
Don't idiots know you can just download more RAM?
>>
>>735825717
The market isn't based on demand
>>
>>735823692
This just means they'll buy the same amount of ram and use it to run more slop more efficiently. You won't see any of it.
>>
>>735823692
No this means they can escalate faster and push for MOAR DATACENTER MUHHHHHHH AGIIIIIIII
The RAM demand is literally going to go up by 6x
>>
This breakthrough is for the KV CACHE ONLY not the whole model.
Jeet-nation is spreading disinformation, the whole market is in a downturn right now.
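For scale, here's a back-of-envelope KV-cache size estimate in toy Python. The config numbers (80 layers, 8 KV heads, head_dim 128, 128k context) are hypothetical 70B-class values picked for illustration, not any specific model:

```python
def kv_cache_bytes(n_layers, n_kv_heads, head_dim, seq_len, bytes_per_elem=2):
    """Keys + values: one tensor each per layer, fp16 (2 bytes) by default."""
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_elem

# hypothetical config: 80 layers, 8 KV heads (GQA), head_dim 128, 128k context
size_gib = kv_cache_bytes(80, 8, 128, 128_000) / 2**30  # ~39 GiB per sequence
```

Even halving that per sequence frees real memory at serving time, but it does nothing for the weights themselves or the training footprint.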
>>
>>735826031
tech limit. will take more powerful hardware to go beyond. takes time to research and develop. slow but steady progress.
now your best short term bet is to optimize. long term is to evolve beyond
>>
>>735823692
Dalit_nation screen caps should be a bannable offense
>>
>>735826072
>ARE YOU STUPID LIKE EVERY OTHER NPC IN THIS THREAD?
Yes

I still don't get it.
>>
>>735824162
This is the dumbest post I've seen in a while.
None of those examples become old after 4-5 years. A car has insurance and way better resale value. A fridge doesn't need to be upgraded with new components to cool the latest foods. A guitar can last a lifetime if you maintain it.
>>
>>735823692
If AI is so great, why can’t they just ask chatgpt to find a way to manufacture cheap ram?
>>
>>735826072
At least try to write proper English when you're insulting the intelligence of others.
>>735826373
It's optimized AI code. It doesn't need as much RAM to run. That's it.
>>
>>735825946
Oh great, where are they available? Did it come at no expense for those who don't even use it?
>>
>>735825717
For Jevons paradox to take meaningful effect in this case it would require perhaps a 3x-10x increase in heavy users, and most people have only a modest daily need for AI.

The benefit here will be for long, complex operations, not your mom asking a question about soup ingredients.

Additionally, since concurrency on the same GPUs/TPUs has sharply declining efficiency as you add parallel threads, that sort of scaling would still require a ton more GPUs. Most of the benefit comes from going from 1 thread to 2-4; beyond that, per-thread speed drops enough to make the service less useful.

So in other words, they're still bound by infrastructure limitations. Not enough electrical capacity, not enough datacenters.

We are going to see a 10-20% decrease in demand in perhaps 2-3 years. Even if every company on Earth deployed this immediately they can't get out of the multi-year contracts they signed. Please stop commenting on things you know nothing about.
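The diminishing-returns claim can be sketched with a toy sublinear-throughput model. The exponent alpha here is a made-up contention parameter for illustration, not a measured number:

```python
def total_throughput(threads, alpha=0.6):
    """Toy model: aggregate throughput ~ threads**alpha; alpha < 1 means
    each extra parallel thread contributes less than the previous one."""
    return threads ** alpha

# marginal gain from each added thread shrinks monotonically
gains = [total_throughput(n + 1) - total_throughput(n) for n in range(1, 8)]
```

Under any concave curve like this, going from 1 to 2-4 threads buys most of the aggregate benefit, after which you need more hardware rather than more concurrency.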
>>
>>735824103
my favorite elon musk story is him getting forcibly removed from paypal's leadership because he kept pushing for his less popular "X" payment service to supersede Confinity's paypal service and caused friction within the company, along with pushing hard for microsoft software over unix software. he purchased the X.com domain back from paypal in 2017, specifically to facilitate lobotomizing twitter into X down the road
>>
File: 1752433131650266.png (167 KB, 449x478)
If your response to this thread was not
>jeet nation

then you need to turn in your /v/ badge.
>>
>>735826584
Thanks.
>>
>>735826656
given the guy *makes money* off of /v/'s catalog, i'm astonished it's not a 3-month ban minimum on sight to post screencaps of him in return
>>
>bharat_nation sharing misleading news
It's nothing like 6, it's not even 2. It's a very specific part of LLMs, a fraction of the whole.
>>
>>735823839
I can't tell if you retards are just addicted to doomerism or if you're underage. RAM has spiked sky high many times in the past 50 years, it has always returned to pre-spike levels. Overall cost of RAM has always, forever, trended down generation over generation. A spike has never resulted in a permanent price increase.
>>
File: 1758215411651064.png (2.47 MB, 1000x1211)
KINOComing
>>
>>735826584
>It doesn't need as much RAM to run
Not entirely correct, it doesn't need as much RAM to produce output but you still want all the RAM for training.
>>
>>735823692
Is this the Solid State Model? They've been talking about this for a little bit now so I'm wondering if this may be that one. The transformer models were never going to be the final step, but people who don't know what they're talking about (like people here) seemed to believe they were.

I do wonder if ChatGPT changes its name, since it would be ridiculous for them to not move to the Solid State Model too.
>>
>>735825847
I've answered this too many times. There are manifold benefits that still exist even if you personally choose not to use them or are too dumb to understand them, and temporarily losing cheap access to 4K bing bing wahoo is something you'll have to cope with

>>735825950
Answered in detail here:
>>735824562

>>735826587
You not being able to pass your driver's license test doesn't mean you don't need to pay your share for roads to be built and maintained. You are your grandmother complaining she has to pay AT&T to "build all these internets she has no use for".

AI is already being used for things like advancing healthcare, chemistry, etc. If I'm ever bored enough sometime I'll put together a list so I have something to paste when retards make posts like this. That day is not today.
>>
>>735824162
what kind of poorfag mentality is this
>>
>>735826847
Past trends never predict the future with 100% accuracy kiddo
>>
>>735825717
AI data centers have already outstripped the demand for AI actually being used for anything substantial.
>>
And yet Sony will be charging 900€ for the prostation 5 now lmao
>>
>>735826656
I come from a time on the internet where Indians weren't even common in call centers yet, let alone any having internet access. We didn't know how good we had it, because the internet still sucked. In a good way.
>>
>>735826847
I use the same argument when I tell people that the Roman Empire is eternal.
>>
>>735826978
I've never heard of that, sounds interesting. You should look up diffusion LLMs, IMO they might be the next step. Generates the entire response in parallel instead of serial, it's quite a sight to behold. Very similar to watching a picture come into sharp focus during diffusion image generation.

It's currently not as accurate as conventional LLMs for the reasons you can probably realize if you understand how any of this works, but it's enormously faster.
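A toy sketch of the serial-vs-parallel difference (pure illustration: real diffusion LMs predict tokens with a model over several denoising passes, this just mimics the unmasking schedule):

```python
def autoregressive(target):
    """Serial decoding: one position resolved per step, left to right."""
    out = []
    for tok in target:              # one 'model call' per token
        out.append(tok)
    return out                      # len(target) sequential steps total

def diffusion_style(target, steps=4):
    """Parallel refinement: everything starts masked, and each step
    resolves a whole batch of positions at once."""
    out = ["<mask>"] * len(target)
    pending = list(range(len(target)))
    per_step = max(1, -(-len(target) // steps))   # ceil division
    n_steps = 0
    while pending:
        batch, pending = pending[:per_step], pending[per_step:]
        for i in batch:             # whole batch resolved in one step
            out[i] = target[i]
        n_steps += 1
    return out, n_steps             # ~steps total, regardless of length
```

Same output in far fewer sequential steps; that's where the speed comes from, and the accuracy gap comes from resolving positions without seeing their finished neighbors.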
>>
>>735823692
so companies will just overload the system to use up all their ram again so productivity stays high, nothing changes
>>
>>735826729
I still believe he is a moderator or at least a janitor here. Then again, retarded threads are basically a normality nowadays.
>>
>>735826401
Fridges don't last forever, and if you're smart about when and how you upgrade your PC and what components you choose, you can easily get back half of what you put in later.

Don't buy bottom of the barrel parts to save a bit of money upfront, nobody wants those in 3-5 years. The $300 6 GB GPU you got on sale gave you a bad experience the entire time you had it and it's not actually a "budget" buy versus a $700 GPU because it goes in the trash in a few years instead of on craigslist for $350.
>>
>>735828047
>Fridges don't last forever
my mom's fridge has been chugging along since the cold war
>>
>>735823839
But people were saying 32gigs of RAM was $1k?
>>
>>735824235
These companies aren't going to buy literal billions in RAM that they no longer need. Half of these are already dangerously close to insolvency anyway
>>
>>735823798
Why don't we get AI to figure out how to make RAM less expensive? It's not rocket appliances.
>>
>>735823692
>ai gets more efficient
>what idiots think will happen
Yay now ram won't be slurped up by ai companies because it takes less ram to run the same ai than it used to!
>what will really happen
Yay now we can run even more ai for less ram! We'll continue slurping up all ram because why the fuck would we stop? This just means we can do even more with it!!!
>>
>>735824529
Advanced everything
>>
>>735828269
They will, stock markets are all about projecting power and babying retarded shareholders.
>>
>>735825870
>the last memory shortage
when was that?
>>
>>735825847
I have gotten some amazing cooms out of AI, so it was all worth it.
>>
>>735825847
It made rich people money
>>
>>735827019
Streets can be used by people without a license, retard
Can't wait to see those amazing advancements in medicine
>>
File: 1536717824394.png (13 KB, 500x500)
Will this technology allow for high quality LLMs to fit locally on consumer GPUs?
Because I will need to plan accordingly for spending the next few months doing nothing but fapping myself to near-death on loli ERP if so
>>
>>735823692
>random access memory memory companies
>>
>>735829026
what the fuck did you expect from bharat_nation?
>>
>>735828047
Stop abusing your fridge. Don't overfill the freezer and keep the vents clear. That's all it takes. I've never had to replace mine.
>>
Look, if it helps that's great, but let's first consider:
>A) making sure it's not proprietary bullshit that Google will push for Gemini and use to fuck over competitors, open source models, or other mega corps
If it's only something that makes it easier / cheaper for Google, that's not going to help.
>B) that it's universally viable across model types, not only useful for text LLMs while having no benefit for diffusion image, voice, video, etc.
>C) that it won't just lead to the companies basically buying up all the RAM in order to make more efficient progress
This is kind of like how games didn't become magically more efficient with larger amounts of RAM or GPU resources available; instead they expanded to fill the newly available space, and sometimes far worse, like with Unreal Engine or DLSS / ray tracing bullshit

So a lot of stuff to consider first.
>>
>>735823692
Wouldn't this mean RAM is now even more valuable since they can do more with less?
>>
The AI "industry" is notoriously wasteful and inefficient, they'll keep buying the same amount of ram and just not use it.
>>
>>735823692
They didn't save shit. We could have 1024gb DDR7 ram and game devs would still make games as vibe coded and unoptimized as possible. I hoped hardware shortages would force them to actually learn how to make games, but alas.
>>
>>735828269
Except we're in this mess just because altman bought up a bunch of RAM for the potential future. None of these companies operate like real companies.
>>
>>735824529
Everything pro. After that is everything premium, then everything pro premium.
>>
Thats fucking bullshit i just spent 1.5k on ram and ssd alone. Fuck my nigger life
>>
File: 1747131434987720.jpg (124 KB, 1024x909)
>>735823692
*taps the post*
>>
I'm BUYING
>>
>>735830112
I make it a reflex to block every single one of these "repost 4chan screencaps" accounts when the algorithm inevitably shills one of them to me. Fucking subhuman jeets. I wish I could kill them one by one and crush their balls in slow motion
>>
File: smug look.png (261 KB, 800x596)
>>735823839
First time?
>>
>>735830112
it's OC tho


