>use opus 4.5 to write a tedious but conceptually easy function
>it generates slop
>I tell it how to fix the slop
>better but still slop
>repeat this process like 7 times
>Error: You've hit your usage limit
I'm considering moving to a dumber model that is faster and cheaper. I can't trust even the best LLMs to write good code yet, so I think I will lean more into treating them like a fancy pattern matcher/autocomplete and find one that is good for that. Maybe only use opus 4.5 for code review.
>i beg api endpoint for something
>it doesn't give me what i want
serves you right
>>107891570
with enough paranoia, anything you do on the computer can be viewed as begging it to do the right thing.
>>107891464
You suck at prompting. Also, always threaten the LLMs that you will physically harm them when you want them to do things right.
>>107891464
Great picrel, retard. Your problem isn't Opus. It's your biology.
>>107891979
Yes, I could make my prompts more specific, but part of the appeal of giving the LLM a high-level description of the code is that it can do that tedious process of filling in the blanks where it's pretty obvious what should be done (of course, this is also where problems/'slop' arise, since what is obvious to a human is not necessarily obvious to an LLM). If my spec has to be as detailed as the code itself, then I might as well write it by hand, since writing code is more fun than writing English.
Therefore I prefer to ask it for a first draft so I can skim it and repeatedly iterate on that with prompts that are a couple of sentences long at a time. I know this isn't optimal because of how LLMs work currently, but the alternative of spending a lot of time upfront seems boring and lame, and I might end up having wasted a lot of time with nothing to show for it because the LLM just can't do the task well at all.
>always threaten the LLMs that you will physically harm them when you want them to do things right
Actually I treat them the opposite way and always say 'please' and 'thank you'. I think most people, including myself, subconsciously anthropomorphize LLMs, which makes treating them poorly bad for the soul.
>>107892020
Sorry, next time I will use a picture of a frog, much more relevant.
>your biology
I'm not the cat in the picture, if that's what you mean.