Why does it keep getting dumber?
I asked it a simple physics manometry question. It solved it correctly, except one term had a negative sign instead of a positive one. I sent it the official answer, which was incorrect, and it gave up on its own answer and started arguing that the wrong answer was correct. Then I sent the correct answer and it started arguing that its first answer was correct.
Meanwhile, Gemini got the answer right in one go. I sent it the wrong answer from the book and it said the official solution was wrong.
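For context on why a single sign is easy to flip: in a generic U-tube manometer balance (a textbook form, not necessarily the exact problem from the post), you start at one surface and add ρgh when moving down a fluid column and subtract it when moving up:

```latex
% Generic U-tube manometer pressure balance (illustrative, assumed setup):
% start at point 1, go down a height h_1 in fluid of density \rho_1,
% then up a height h_2 in fluid of density \rho_2 to reach point 2.
P_1 + \rho_1 g h_1 - \rho_2 g h_2 = P_2
\quad\Longrightarrow\quad
P_1 - P_2 = \rho_2 g h_2 - \rho_1 g h_1
```

Whether each ρgh term enters with a plus or a minus depends entirely on the traversal direction, which is exactly the kind of sign convention a model can flip while getting everything else right.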
LLMs can't do maths, fam.
The more data they feed it, the more conflicting the results and its internal parsing become.
Likely at some point they stopped feeding it curated, checked data and started training it on scrapes of whatever they could get their hands on, and the results got more erratic and hallucinatory.
With visual output this is easy to spot immediately as "wrong", but with written information you can't instantly tell whether it starts bullshitting halfway in, in small details, or in totality.
>>108631064
Because you're not paying anything for it.
It's a language model trained on Redditors arguing about whose opinion is more heckin valid. Your answer might be the top result because some dude hates Trump.
Most people just want it to code iPhone apps.