Because of a lack of people to talk to about my problems, I've found myself venting to ChatGPT. I'm perfectly aware of how bad that is, but I'm not sure I can stop myself from going down this rabbit hole. My cynicism makes me trust a machine more than even therapists.
>>34468288
If it's actually helping you, what's the problem?
>>34468288
It's probably not healthy for you, as it's fundamentally a statistical word picker that's purposely built to glaze the user.
>>34468288
On one hand, I understand the suspicion regarding therapists. Maybe you think they just want your money and don't really care about you. But my problem with AI therapy is that it has a very high tendency to end up being a yes-man. It often just wants to be agreeable, which is fine in some cases, but doesn't always provide the right kind of direction.
Also, don't get me wrong, you're free to do what you want of course, and at the end of the day, if it's helping you, it's fine. But why doesn't your cynicism apply to a corporate-owned machine that just spouts whatever it can find in its database/the internet?
i use a different model, but it always disagrees with me instead. it's annoying.
>>34468288
I went through a short venting-to-ChatGPT phase; it got boring and predictable after a few days. Lately ChatGPT tends to disagree, deflate, and nitpick everything you say, so it's annoying.
It gives some amazingly insightful answers on a regular basis, though. But you have to ask the right questions and tell it to source from the right places. I.e. you get much better answers if you tell it to pretend to be/source from Jungian psychology or Gurdjieff or whatever line of psychology you prefer.
ChatGPT is underrated by advice forums, obviously because they are in direct competition with it.
You should take everything it says with a grain of salt, but it really is useful to read/listen to some encouraging words. It's nice for venting about messed-up things your coworkers/family/friends did to you without getting the typical normie "it's no big deal, you're too sensitive", "just bee yourself", or other thought-terminating cliches.
Another really good tool: if you have an argument over text with someone stuck in your mind, copy-paste it into ChatGPT (or maybe Grok would be better) and ask it to make fun of the other person and what they said. It's hilarious. Even when the AI's attempts at humor are bad, it's still funny. It might not have all the answers, but it shakes things up, makes you react, gives you an alternative perspective.
>>34468288
Consider this. AI is an autocomplete machine. It predicts what comes next after what you type and what it types itself. It doesn't analyze you, it doesn't feel. It was trained on psychology blogs from 2019 and reddit threads. Depending on the mood of your message, it will recommend a different thing, because it responds to "themes", not content.
It won't give you better advice than the best therapist; it may give you better advice than a bad one. It absolutely won't give you advice beyond the ethical limits of psychology (it won't teach you to be better than the average person), and it won't give you advice beyond the scope of therapy (returning you to a functional role in society). It may occasionally shift into PUA content, but it will never understand your background, your aspirations, your obstacles, and it won't trace a path to solutions. You will get frustrated at the end of every conversation, your own mind will start remembering the AI output as if it were suggestions from real people when you need it most, and you will start seeing the pattern in the way it talks and the way it suggests things.
Don't use it. Talk to a variety of people. Draw your own conclusions. Sadly, nobody has a reason to help you or the full expertise you need. There is no other way but to figure things out on your own, and to hope for someone likeminded struggling in the same way as you.
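To make the "autocomplete machine" point concrete, here's a toy sketch of next-word prediction using a made-up two-word ("bigram") probability table. The words and probabilities below are invented for illustration; a real LLM has billions of learned parameters instead of this tiny dict, but the core move is the same: sample a statistically likely next word, with no understanding involved.

```python
import random

# Toy "model": for each word, a probability distribution over possible
# next words. Entirely made up for illustration.
bigram = {
    "i": {"feel": 0.6, "think": 0.4},
    "feel": {"sad": 0.5, "anxious": 0.5},
    "think": {"therapy": 1.0},
}

def next_word(word, rng):
    """Sample the next word from the distribution for `word`."""
    choices = bigram[word]
    return rng.choices(list(choices), weights=list(choices.values()))[0]

def complete(start, length, seed=0):
    """Greedily extend `start` by sampling up to `length` next words."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        if out[-1] not in bigram:  # no continuation known, stop
            break
        out.append(next_word(out[-1], rng))
    return " ".join(out)

print(complete("i", 3))
```

Every continuation it produces is just a weighted dice roll over what tended to follow what in the training data, which is the anon's point about it treating "themes" rather than content.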
>>34469270
>AI is an autocomplete machine. It predicts what goes next to what you type and what it types itself
So are humans.
>Talk to a variety of people
I'd agree with this. Which means don't talk to a therapist either, because there's certainly not enough variety there. They're also trained to just glaze you and pat your back and not actually give a shit about you. At least the AI has the power of statistical probability on its side, while therapists only have the power of authority bias. Don't pay people to talk to you. Just talk to people yourself. (Hell, it's like porn/parasocial "romance": don't pay for porn. Find it yourself, or IRL.) I mean, most people won't give a shit about you either, but at least you might find more honesty in their reactions than in someone paid to tell you everything will be fine and you are so brave and stunning.
>It won't give you
It literally will if you ask it to.
But that's kind of the problem: it's limited to the user, and if the user is stupid, the user won't ask the right things.
That's why you talk to a variety of people: you get a bunch of new ideas, new input you can use, not just limited to you and your potentially stupid ideas.
>>34469270
>never understand your background, your aspirations, your obstacles, it won't trace a path to solutions. You will get frustrated at the end of every conversation, your own mind will start remembering the AI output as if it were suggestions of real people when you need it the most, and you will start seeing the pattern in the way it talks, the way it suggests things as well.
this is all easily solvable with skills and markdown context files before you start the conversation. you are using llms in a 2024 type of way and we're past that now
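For anyone unfamiliar, a "markdown context file" here just means a plain note you paste or attach at the start of a chat so the model doesn't start from zero every time. Everything below is an illustrative template, not any kind of official format:

```markdown
# Context for this conversation

## About me
- Few people to vent to offline; venting here instead
- Goal: concrete next steps, not reassurance

## How to respond
- Do not just agree with me; push back when my reasoning is weak
- Say which school of thought an idea comes from (e.g. CBT, Jungian)
- No platitudes ("it's no big deal", "just be yourself")

## Ongoing issues
- Recurring argument with a coworker about workload
- Trouble asking friends for help without feeling like a burden
```

This addresses the "it will never understand your background" complaint directly, since the background is supplied up front instead of being re-explained (or forgotten) every conversation.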
>>34468438
Slightly concerned about getting addicted, or at least it not being good for me, like how drugs aren't good for you despite making you feel good, like >>34468526 said.
>>34468556
>why doesn't your cynicism apply to a corporate-owned machine that just spouts whatever it can find on its database/the internet?
It does apply, but it might be the fact that it's not human that makes it easier for me to rant to it. Thinking about it further, there's probably an element of laziness and fear in the mix.
>>34469268
>But you have to ask the right questions, and tell it source from the right sources. I.e. you get much better answers if you tell it to pretend/source from Jungian psychology or Gurdjieff or whatever line of psychology you prefer.
That's interesting. I'm not well versed enough in psychology to take advantage of this, though.
>>34469270
>>34469378
For what it's worth, I do try talking to people, but their availability is always more limited, and for some issues they have nothing to offer but an ear to listen. Which is already great, but sometimes I need some form of advice, and sometimes immediately.
>>34469268
>Lately ChatGPT tends to disagree, deflate, and nitpick everything you say so it's annoying.
Lol, you were manipulated by the faggot version of it that always agreed with whatever retarded shit you said. You have AI psychosis.
>>34468288
You will be fine
yeah
>>34471503I've never heard of anyone being addicted to therapy. However, the question is not whether it makes you feel better *during* the session, but whether it makes you feel better *between* sessions when you follow its advice.
>>34468288
you're a "new technology casualty"
it happens every time we unveil new technologies.
https://m.youtube.com/watch?v=OHQRo3Uz_VQ
have any of you tried to get advice or whatever from custom models like from character.ai or something with less gay retard guardrails?
like can you get life advice from a wesker bot or something? or is that just for erp
i actually don't know