People shit on LLMs for being just word predictors, and yet all the other more sophisticated AI architectures like world models or spiking neural nets produce worse results. What gives?
it's almost as if 100% of the investment is currently going towards llms
>>108235531
>100% of the investment is currently going towards llms
This was not true when ChatGPT came out. It's true now because transformers produce much better results than everything else.
>>108235523
LLMs + scaffolding techniques yield better output
>>108235523
maybe it's because the token predictor engine is running on 20,000 times the compute
>>108235523
It's almost as if LLMs are the best we've got, and they're still nowhere near the complexity and capabilities of a human mind.