I tried to take down ChatGPT by frying its circuits. Why didn't it work? Shouldn't it have been stuck in an infinite loop? I've seen thousands of AI movies and it always worked.
The adversarial part is where this catch occurs
Ask it to show you the Unicode for a seahorse
>>107151786
why do thirdies use chatgpt for shit like this
>>107151786
>I've seen thousands of AI movies
Name 5
>>107151786
it might work on an AI but not on a parrot.
>>107151786
>Shouldn't it have been stuck in an infinite loop?
Why would it be stuck in an infinite loop? Can you explain what mechanistic aspect of a decoder transformer model would force it to get stuck?
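For context on why there's no mechanism to get stuck: decoder inference is just a bounded sampling loop. A toy sketch below, where `fake_next_token` is a made-up stand-in for the model's sample step (not any real API), and the hard cap mirrors the `max_new_tokens`-style limit real inference servers enforce:

```python
import random

EOS = 0                    # end-of-sequence token id (assumed convention)
VOCAB = list(range(1, 50)) # toy vocabulary

def fake_next_token(context):
    # Stand-in for the model's forward pass + sampling step.
    # The real thing is a softmax over logits; same control flow.
    return random.choice(VOCAB + [EOS])

def generate(prompt_tokens, max_new_tokens=128):
    out = list(prompt_tokens)
    for _ in range(max_new_tokens):  # bounded: the cap guarantees halting
        tok = fake_next_token(out)
        out.append(tok)
        if tok == EOS:               # model may stop early on its own
            break
    return out
```

No prompt can make this run forever: the worst case is `max_new_tokens` iterations, after which the server cuts the generation off regardless of what the model "wants" to say next.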
ban boomers who think LLMs are anything more than glorified calculators
>>107151786
You're talking to a token predictor.
>>107152508
llms are not calculators even if they can do a bit of arithmetic at millions or billions of times the operation count a normal calculator requires
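The "millions or billions of times" claim checks out as an order-of-magnitude estimate. Using the common rule of thumb that a dense transformer forward pass costs roughly 2 × parameter-count FLOPs per generated token (the model size and calculator cost below are assumptions for illustration):

```python
params = 70e9                 # assumed 70B-parameter model
flops_per_token = 2 * params  # ~1.4e11 FLOPs to emit a single token

calculator_ops = 10           # generous guess for one pocket-calculator addition
ratio = flops_per_token / calculator_ops

print(f"{ratio:.0e}")  # on the order of 1e+10 operations per operation
```

So emitting one token of "2 + 2 = 4" really does burn around ten billion times the work of a calculator doing the addition directly.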
>>107152720
you are a token predictor. why do the reductionist retards never take the final step in their deduction and look at how the human brain is just another physical computation device that functions in much the same way?
>>107152731
>why
Because while they're indeed retarded, at least they're still smarter than you.
>>107151826
ask it in German and the instance will crash
it's a current example of a training-data poisoning attack
>>107152731
It's not a reductionist argument. You're literally talking to a token predictor - a language model. LLMs can be components of larger architectures, or used as control centers for agentic architectures, or constantly fed their own output back before giving you a response to simulate reasoning, but they're still token predictors.

It's not going to choke on your logic problem because it's not actually running computations within the model itself to determine a solution. It's predicting response text based on its training and hyperparameters. Best-case scenario would be for it to actually generate and execute a program based on the OP prompt, but there are likely compute guardrails in place to stop the average joe from racking up a billion-dollar cloud bill + OP didn't actually ask for validation through tool execution in their prompt.

This stuff isn't magic. It's layers of abstraction presenting as emergent phenomena which can still get nut checked by anyone who knows the architecture + paid a bit of attention during their theory courses.
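The prediction-vs-execution distinction above in a toy sketch. Both functions are hypothetical stand-ins, not any real API: the first emits canned plausible-looking text (deliberately wrong), the second actually computes the answer, which is what tool execution buys you:

```python
def llm_predict(prompt: str) -> str:
    # A pure token predictor emits the *text* of an answer; nothing
    # inside it runs the arithmetic, so fluent-but-wrong is possible.
    canned = {"What is 12345 * 6789?": "12345 * 6789 = 83810505"}  # wrong digits
    return canned.get(prompt, "...")

def llm_with_tool(prompt: str) -> str:
    # An agent wrapper that hands the task to an actual interpreter;
    # here the parsing step is skipped and the operands hardcoded.
    a, b = 12345, 6789
    return f"{a} * {b} = {a * b}"  # computed, not predicted

print(llm_predict("What is 12345 * 6789?"))    # fluent, wrong
print(llm_with_tool("What is 12345 * 6789?"))  # 12345 * 6789 = 83810205
```

Same prompt, same surface format, but only the second path involves a computation that could fail, loop, or "fry" anything - and it runs outside the model.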