/g/ - Technology


Thread archived.
You cannot reply anymore.




File: transformer.png (110 KB, 1340x759)
People shit on LLMs for being just word predictors, and yet all the other, more sophisticated AI architectures like world models or spiking neural networks produce worse results.
What gives?
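For context on the "just a word predictor" framing: an LLM autoregressively predicts the next token given the previous ones. A minimal sketch of the same idea, using a toy bigram model built from word counts rather than a learned neural network (the corpus and function names here are illustrative, not from any real system):

```python
from collections import Counter, defaultdict

# Toy "word predictor": given the previous word, predict the word that
# most often followed it in training data. LLMs do conceptually the same
# thing over tokens, at vastly larger scale, with learned probabilities
# instead of raw counts.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent next word, or None if the word was never seen."""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

def generate(start, n=5):
    """Greedily generate up to n more words starting from `start`."""
    out = [start]
    for _ in range(n):
        nxt = predict_next(out[-1])
        if nxt is None:
            break
        out.append(nxt)
    return " ".join(out)

print(generate("the"))
```

Greedy decoding like this always picks the single most likely continuation; real LLM inference usually samples from the distribution instead, which is why the same prompt can yield different outputs.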
>>
it's almost as if 100% of the investment is currently going towards llms
>>
>>108235531
>100% of the investment is currently going towards llms
this was not true when ChatGPT came out.
It's true now because transformers produce much better results than everything else
>>
>>108235523
LLMs + scaffolding techniques yield better output
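"Scaffolding" here usually means wrapping the raw model in a generate-verify-retry loop rather than trusting a single completion. A hedged sketch of one common pattern, where `call_llm` is a hypothetical stand-in for a real model API (mocked below so the example runs):

```python
import json

def call_llm(prompt, attempt):
    # Mock model: pretend it only returns valid JSON after corrective feedback.
    # A real implementation would call an actual model API here.
    return '{"answer": 42}' if attempt > 0 else "Sure! The answer is 42."

def ask_with_scaffolding(prompt, max_retries=3):
    """Re-prompt until the output passes a machine check (here: valid JSON)."""
    for attempt in range(max_retries):
        raw = call_llm(prompt, attempt)
        try:
            return json.loads(raw)  # verifier: output must parse as JSON
        except json.JSONDecodeError:
            prompt += "\nRespond with valid JSON only."  # corrective feedback
    raise RuntimeError("model never produced valid output")

print(ask_with_scaffolding("What is 6 * 7?"))
```

The key design point is that the verifier is cheap deterministic code, so the loop converts an unreliable generator into a more reliable system without changing the model itself.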
>>
>>108235523
maybe it's because the token predictor engine is running on 20000 times the compute
>>
>>108235523
It's almost as if LLMs are the best we got and it's still nowhere near the complexity and capabilities of a human mind.




