If RAM is so expensive and AI needs a lot of it, why won't they just optimize the AI?
>>107626529
Fun fact: Most LLMs are made in Python.
>erm, so what chu-
If you would please consult my graph.
Scam Altman does not want that because he wants a monopoly. It's not working, though: his competition keeps optimizing while he tries to brute force it and buys up all the supply to cripple them.
>>107626588
Source?
>>107626529
The only people out there with the know-how to optimize ML to run efficiently on graphics cards are Nvidia engineers.
Nvidia wants to charge fat stacks of money for pitiful amounts of VRAM.
>>107626742
Sorry, I was petting my cat xd
https://thenewstack.io/which-programming-languages-use-the-least-electricity/
There's also this one, which seems to corroborate it. Python is consistently at the bottom of these tests in terms of energy and RAM efficiency.
https://www.sciencedirect.com/science/article/pii/S0167642321000022
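For what it's worth, the RAM part of that claim is easy to see for yourself: plain Python objects carry a lot of per-element overhead compared to a packed buffer, which is exactly why ML frameworks keep the actual tensors in C/CUDA and only use Python as glue. A quick sketch (numbers will vary by interpreter and platform):

```python
import sys
from array import array

n = 1_000_000

# A plain Python list of floats: each element is a full PyObject (~24 bytes
# on CPython) plus an 8-byte pointer slot in the list itself.
py_list = [float(i) for i in range(n)]
list_bytes = sys.getsizeof(py_list) + sum(sys.getsizeof(x) for x in py_list)

# A packed C-double array: 8 bytes per element plus a small header.
packed = array("d", (float(i) for i in range(n)))
packed_bytes = sys.getsizeof(packed)

print(f"list of floats : {list_bytes / 1e6:.1f} MB")
print(f"packed doubles : {packed_bytes / 1e6:.1f} MB")
# Expect roughly a 4x gap on CPython; tensor libraries like NumPy and
# PyTorch store data densely like the packed array and do the math in C.
```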
>>107626529UOOOOOOOOOOOOH NAKADASHl IN RENGE!!!
>>107626529
Let's say you can make an AI that runs in 128GB of memory and scores 96% on some benchmark. Now let's say that a bigger model scores 98% but requires 512GB. The sensible and responsible thing to do would be to stick with the much smaller model that's nearly as good. Unfortunately, while you're busy doing that, your competitor has just released a model that requires 2TB and scores 99%. Nobody cares about anything other than bigger number = better, so your sensible, balanced model immediately becomes irrelevant. Better start working on a 4TB model that scores 99.8%.
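To make the diminishing returns explicit, here's a throwaway sketch using the hypothetical figures from that post (the sizes and scores are just its examples, not real models):

```python
# Score gained per extra GB of memory at each step up in model size,
# using the made-up numbers from the post above.
models = [
    ("small", 128, 96.0),    # (name, GB of memory, benchmark score %)
    ("big", 512, 98.0),
    ("huge", 2048, 99.0),
    ("absurd", 4096, 99.8),
]

for (name_a, mem_a, score_a), (name_b, mem_b, score_b) in zip(models, models[1:]):
    gain = score_b - score_a
    cost = mem_b - mem_a
    print(f"{name_a} -> {name_b}: +{gain:.1f} points for +{cost} GB "
          f"({gain / cost * 1000:.2f} points per extra TB)")
```

Each step buys less benchmark for vastly more memory, but the leaderboard only shows the score.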
>>107626529
It's probably cheaper to buy some more RAM than to optimize the models.
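Whether that's actually true comes down to scale. Napkin math, with every number below a made-up placeholder you'd swap for your own:

```python
# All figures are illustrative assumptions, not real costs.
engineer_cost_per_month = 20_000      # USD, fully loaded (assumption)
months_to_optimize = 6                # assumption
memory_saved_gb = 384                 # e.g. squeezing 512GB down to 128GB
server_ram_cost_per_gb = 5            # USD per GB, rough ballpark (assumption)
servers = 1_000                       # fleet size the saving applies to

optimization_cost = engineer_cost_per_month * months_to_optimize
ram_cost = memory_saved_gb * server_ram_cost_per_gb * servers

print(f"optimize once  : ${optimization_cost:,}")
print(f"buy RAM fleet  : ${ram_cost:,}")
# The trade flips with fleet size: for a handful of servers, buying RAM is
# cheaper; across thousands of servers, paying engineers to optimize wins.
```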
>>107626529
No need to when they have all the money in the world thanks to dumbass investors.
>>107626588
Nocoder spotted
>>107626529
>optimize the AI
plebs can’t be allowed that