it's overhyped af. you literally just talk to an AI on Telegram and it charges you 9999999999 for the good models. like sure it's cool that it can use your Mac mini or whatever but you have to spoonfeed it for every. single. thing
i don't understand why it's only for such a specific machine and i won't ask chatgpt
>>108735886
it's not retarded, normalgroids just fell for the apple marketing ig
>>108735886
>>108735912
You can run openclaw on almost any machine. But Apple's recent machines have a related use case: you can run local models on them, which means you don't have to pay per request (besides the electricity it uses). Now sure, you can run such models on a lot of hardware, but the Macs' unified memory subsystem gives them far higher memory bandwidth than similarly priced machines, and relatively high memory capacity too. So the idea is you pay a few grand for a decent Mac and then you only pay for power, and very rarely to augment your local model with paid frontier models for highly complex tasks. And for the type of shit people use openclaw for, Gemma, Qwen and co. can be sufficient.
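the "few grand up front, then basically just power" argument can be sanity-checked with some napkin math. every number below is an illustrative assumption (machine price, wattage, API rate, local tokens/s), not a measured figure:

```python
# Napkin math: when does a local Mac pay for itself vs a paid API?
# ALL constants are assumptions for illustration, not real benchmarks.

MAC_COST_USD = 2000.0            # assumed up-front machine price
POWER_DRAW_W = 60.0              # assumed average draw under inference load
ELECTRICITY_USD_PER_KWH = 0.15   # assumed electricity rate

API_COST_PER_MTOK = 3.0          # assumed $ per million tokens on a paid model
LOCAL_TOK_PER_S = 30.0           # assumed local generation speed

def local_cost_per_mtok() -> float:
    """Electricity cost to generate 1M tokens locally, in USD."""
    hours = (1_000_000 / LOCAL_TOK_PER_S) / 3600   # time to emit 1M tokens
    kwh = hours * POWER_DRAW_W / 1000              # energy used in that time
    return kwh * ELECTRICITY_USD_PER_KWH

def breakeven_mtok() -> float:
    """Millions of tokens before the machine's price is amortized."""
    return MAC_COST_USD / (API_COST_PER_MTOK - local_cost_per_mtok())

if __name__ == "__main__":
    print(f"local power cost: ${local_cost_per_mtok():.2f}/Mtok")
    print(f"break-even: {breakeven_mtok():.0f} Mtok")
```

under these made-up numbers the power cost per million tokens is pennies, so the whole question is how many hundreds of millions of tokens you'd actually push through it before the up-front cost amortizes. swap in your own rates.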