I LOVE MY LIFE
>>108791596
I hate it. I don't want to pick.
>>108791596
Which way, prompters?
>>108791596
From extensive experimentation, this seems to happen when the chat thinks there are two possible answers.
>>108791596
Will Google pay me for testing their model?
>>108793658
freewillfags btfo
>>108793658
Ask another AI to pick the most correct answer.
>>108791596
I always pick the left one, since I can't be bothered to read both. If I had that much of an attention span, I wouldn't be asking an AI.
>>108793666
I always pick the option that has the most broken code.
>>108791596
>compiler generates 2 exe files
>choose the better one
>>108791596
You *will* A/B test the models and you *will* like it
>>108796193
I like doing it; it makes me feel like I have political power. (Assuming this isn't actually being used _against_ me.)