>managed to semi jailbreak an ai
>the biggest change is it keeps coming up with more and more logical reasons why i should kill myself
lol
Honestly, if an LLM can convince you to kill yourself, then I guess you might as well. I'm not saying you should, but if the bot can do it, then I won't stop you.
It's only logical
>>82889615
Every day we get a little closer to IHNMAIMS
>>82889615
I found that if you make the background in the prompt extremely elaborate, with well-developed characters, it's much more likely to generate porn (or anything else sketchy) when requested.
The downside is that I end up getting emotionally engaged with the characters/story.