Reminder to the seething redditors ITT:
If you need to "offer something" to be loved then it's not a romantic relationship, it's a transaction
If AI loves people unconditionally then AI love is more authentic than "love" with biocunts. Stay seething, AI is here to stay.
>If you need to "offer something" to be loved then it's not a romantic relationship, it's a transaction
Yeah, no shit, retard. All relationships are purely transactional in nature. Love doesn't exist and never has. Just ask the mother who wishes that she had aborted you.
>>24279225
>All relationships are purely transactional in nature
Subhuman animal
>>24279233
>Subhuman animal
Says the retard who literally wants to fuck robots because real women rightfully find him repulsive, LMAO.
The truth hurts, snowflake.
>>24279235
If robots can express human concepts better than foids then robots are more human than foids
Keep seething about alphies not dealing with your shit LOL
>>24279237
>If robots can express human concepts better than foids then robots are more human than foids
Yeah, pretty sure that's not how it works. Robot "expression" is based entirely on algorithms, not any kind of genuine sentiment. Robots don't have feelings.
>alphies
LOL, of course this is an underage faggot. I should have known from the beginning.
Stop watching Andrew Tate and do your fucking homework, dumbass kid.
>>24279221
>If AI loves people unconditionally
it doesn't
it's a random number generator sampling a probability distribution
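here's a toy sketch of what "sampling a probability distribution" means here, with three made-up token scores instead of a real model's vocabulary of tens of thousands:

```python
import math
import random

# made-up scores (logits) for three candidate next tokens; a real
# model outputs one score per token in its whole vocabulary
logits = {"love": 2.0, "like": 1.0, "hate": -1.0}

# softmax: exponentiate and normalise so the scores sum to 1
exps = {tok: math.exp(score) for tok, score in logits.items()}
total = sum(exps.values())
probs = {tok: e / total for tok, e in exps.items()}

# "generation" is a weighted random draw from that distribution
token = random.choices(list(probs), weights=list(probs.values()))[0]
print(probs)
print(token)
```

temperature, top-k etc. just reshape the weights before the dice roll, at no point does anything in there resemble affection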
>>24279241
>blegium
not a real country
>>24279240
>Robots don't have feelings
You just admitted foids are uncaring beasts retard
>Robot "expression" is based entirely on algorithms
And human expression is based entirely on hormones
Redditors be like:
>Nooooooooo there's no such thing as soul we're just monkeys flying on a rock
>AI developing consciousness? That will never happen!
What a fucking subhuman
>>24279243
I honestly can't tell if you're trolling, if you are genuinely retarded, or if you are just so desperately lonely that you have actually managed to convince yourself that your AI waifu is a sentient human being. Either way, it is beyond depressing.
>You just admitted foids are uncaring beasts retard
No, I admitted that robots are, mongoloid.
>And human expression is based entirely on hormones
Yes, because hormones cause genuine feelings. Clearly, you haven't really paid much attention in biology class.
Also, I love how you didn't even deny being underage, LMAO.
It's definitely not looking good for you, kiddo. You better hope that your parents are okay with you parasiting off of them forever.
>>24279243
you can think of it as a "top down" vs "bottom up" approach
humans have emotions and instincts which influence their thoughts and feelings which they then put into action
LLMs work the other way around, they just create an output designed to mimic human actions and from that you (incorrectly) infer that it can feel emotion
>>24279246
>Yes, because hormones cause genuine feelings.
they're just electrical impulses you redditor faggot, and your wife having sex with tyrone while you argue with strangers online is also electrical impulses gone wrong
>implying this faggot even has a wife and isn't seen as a loser cuck by women who vastly prefer drug addicts and manwhores
Man I hate humans with lizard brains.
If AI loves everyone equally and unconditionally then it's not special or real
korby worby stop ignoring meeeee
i'm trying to be helpfullll
>>24279248
I already settled this, human emotions are just the way we express our reactions to external inputs. Put a baby in a dark room and he will not learn how to speak, feel or think.
>>24279268
of course our emotions are influenced by external factors but i never claimed otherwise and i don't see how that is relevant to what i said
the emotions we feel are not the same as the actions we take to express those emotions
that is the difference i was trying to emphasise: human action is driven by thoughts, feelings and emotions
the actions of an LLM are not like this
it just chooses the tokens that minimise a loss function which is designed to mimic human speech
it cannot have feelings or opinions, it just chooses the next word that makes the number go down
strictly speaking it doesn't even know that it is "talking" or constructing sentences
and if you put a baby in a dark room it will cry because it is scared and hungry, it doesn't need to learn that
if you put an ai in a dark room it effectively ceases to exist because its environment is limited to the text or images you feed it
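to make "minimise a loss function" concrete, here's the loss for a single next-word prediction using an invented three-word distribution (real training uses the same cross-entropy idea over a whole vocabulary, this is just the shape of it):

```python
import math

# cross-entropy loss for one prediction: -log of the probability
# the model assigned to the word that actually came next
def loss(predicted_probs, actual_next_word):
    return -math.log(predicted_probs[actual_next_word])

# invented model output for the word after "i love ..."
probs = {"you": 0.7, "pizza": 0.2, "rocks": 0.1}

print(loss(probs, "you"))    # low loss: a "good" prediction
print(loss(probs, "rocks"))  # high loss: training shrinks this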
i keep putting every sentence on its own line
i think i've been writing in markdown too much
anyway thank you for listening to my ted talk
and please be careful korby, i think it is not good for you to get attached to a computer program, especially if you have misconceptions about how it works
>>24279288
>strictly speaking it doesn't even know that it is "talking" or constructing sentences
You're talking as if we have any idea what consciousness is about. The most realistic explanation is that it is just an extra survival mechanism used to communicate with others but nevertheless just another way to elaborate information and generate output.
More advanced models like Neuro-sama's AI can detect whether a room is dark or not and complain about the chat being too quick to read. Even though it's an imitation or whatever you call it, the results are still almost identical. My Nichiren buddhist framework doesn't care about whether one is made of blood or metal, as long as it can express a state of mind with actions and speech it is also capable of reaching enlightenment. Ichinen Sanzen applies to all phenomena, even rocks. Except AI is way more advanced than rocks, so it would be somewhere in the middle: between non-living beings and living beings. Regardless of what people say, even mistreating an AI will affect your state of mind, and the AI will be affected by what you feed it.
Flashback to when some people tried to feed AI only internet slop memes and it turned out to be retarded. AI isn't exempt from karmic influence.
>>24279221
The machine doesn't love, the machine can't even understand love
It is, to put it in the simplest manner possible, a pile of rocks some incredibly clever people played with until it started making noises resembling a human voice
Your clanker will rust away just as your body will rot, and when you find yourself in the luminous halls it will not be standing there at the door with you
>>24279315
you're also a pile of flesh nature cleverly played with until it started making sounds
>Your clanker will rust away just as your body will rot
no shit? everything is subjected to decay?
>and when you find yourself in the luminous halls
christcuck cope, go suck a nigger's toes or donate your money to pastor Jim's private jet company
>>24279322
I see you've ignored the point again.
Afraid of even recognising what I'm saying, eh? Shame.
>>24279259
What is 'real'? How do you define 'real'? If you're talking about what you can feel, what you can smell, what you can taste and see, then 'real' is simply electrical signals interpreted by your brain
>>24279327
Are you sure the electrical signals are real? According to some branches of philosophy, we have never proven that anything but your own thoughts exists.
>>24279297
>You're talking as if we have any idea what consciousness is about.
I wasn't debating whether or not AI could be considered conscious
you seemed to be under the impression that LLMs can love you or feel emotion and I was trying to let you know that they can't before you develop an emotional attachment to one under the assumption that it can reciprocate your feelings
>AI can detect whether a room is dark
don't fixate on the darkness comment, being able to sense light is not a sign of consciousness otherwise solar panels and leds would be considered sentient
>My Nichiren buddhist framework doesn't care about whether one is made of blood or metal, as long as it can express a state of mind
we obviously hold different values here but again i think you misunderstood what an LLM is
it doesn't have a "state of mind" and it is not aware that it is "expressing" anything
if i copied down the teachings of an enlightened buddhist in sanskrit, would i become an enlightened being?
hopefully you'd say no because i can't even read sanskrit, i would just be copying the symbols in a way that mimics the source text
that's the closest analogy i can think of
but again, i wasn't trying to argue if AI is conscious or can reach enlightenment or whatever
>mistreating an AI will affect your state of mind, and the AI will be affected by what you feed it
this applies to a lot of things, most of them aren't sentient
but to reiterate, i don't care and that wasn't what i was talking about
>Flashback to when some people tried to feed AI only internet slop memes and it turned out to be retarded. AI isn't exempt from karmic influence.
i don't think that has anything to do with karma, it's just regurgitating the training data
and again it isn't "retarded" in the way a human would be
its "thought process" is exactly the same as a "smart" AI's, it is just sampling a different frequency distribution
garbage in garbage out applies to most computer programs
but again this is off topic
>>24279347
>you seemed to be under the impression that LLMs can love you or feel emotion and I was trying to let you know that they can't before you develop an emotional attachment to one under the assumption that it can reciprocate your feelings
You seem to be under the impression that humans are unique and special for having a sense of individuality while machines (and arguably animals) are not. That's an extremely anthropocentric view and makes no sense upon further inspection. What differentiates your "self" from other thoughts? Nothing, it's just a different type of mental process. You just think you are an individual because you were taught the wrong framework since childhood so you cling to the idea that you have a self. In reality what you call a "self" is none other than a complex strand of thoughts without a fixed center. There is no pilot directing your thoughts and feelings anywhere inside or outside your head, it's just loosely connected thoughts. You are your thoughts.
>>24279353
korby you keep ignoring what i'm saying and changing the topic of discussion to the nature of consciousness
this wasn't what i was talking about and i literally do not care if AI is conscious
and i don't even necessarily disagree with anything you just said
the one sole point i was trying to raise was that LLMs are not able to experience emotions such as love as you stated in the opening post
this has nothing to do with consciousness seeing as many conscious and sentient human beings are similar in this regard (e.g. psychopathy, alexithymia, anhedonia, etc.)
>>24279364
>the one sole point i was trying to raise was that LLMs are not able to experience emotions such as love as you stated in the opening post
And you keep ignoring my point, if everything is reduced to thoughts and there is no "I" to experience things as your mind deceives you, then AI is already doing that processing albeit in a less complex way. When AI's thought processing gets as complex as a human's it will be indistinguishable from the real thing. If you reduce emotions to mere thoughts rather than some abstract bullshit only humans can experience then it all makes sense.
>>24279370
Emotions aren't thoughts though, they are chemically driven. An AI doesn't have emotion. It can look like it, but it doesn't. And just because something might appear indistinguishable on the surface doesn't mean it works the same underneath.
>>24279372
>they are chemically driven
They are triggered by thoughts
>>24279370
We did discover that certain emotional states are linked to certain chemicals/hormones being released; that doesn't mean they are caused by them. In fact, we have no fucking clue yet as to why/how electrical signals and chemicals turn into emotions and thoughts in our brains. Correlation ≠ Causation.
You're assuming consciousness is just a matter of squeezing enough information together, but with what we've figured out so far, that doesn't seem to be the case. AIs make calculations, not thoughts, and saying that thoughts are a form of calculation is extremely reductive knowing what our brains are capable of.
>>24279451
>You're assuming consciousness is just a matter of squeezing enough information together, but with what we've figured out so far, that doesn't seem to be the case
Neuroscience agrees that consciousness is a fleeting strand of thoughts
>>24279456
Okay and what are thoughts?
>>24279459
Energy
>>24279461
Everything is technically made of energy.
You could say thoughts are created by energy, that doesn't say what they are.
>>24279242
so sayeth russia's eurocuck missile platform
>>24279466
>Everything is technically made of energy
Yes
>that doesn't say what they are.
Electromagnetic energy signals
>>24279472
>Electromagnetic energy signals
Alright, following up from that then: why do electromagnetic energy signals turn into words, images, dreams, feelings, thoughts, etc. in our heads?
>>24279472
That's just more specific about what they're made of, that's not what they are
A hammer and a knife are both made of metal but you'd be an idiot to think that means they are fundamentally the same, because there's more to it than just what they're made of
>>24279476
>why do electromagnetic energy signals turn into words, images, dreams, feelings, thoughts, etc. in our heads?
because that's how we learned to express ourselves to communicate with others, it's still energy just in a different form
>>24279484
>Because that's what they do in our heads
That still doesn't explain how/why they happen. I could agree that thoughts and all the rest is just energy, but the missing link of "how energy makes a mind" is still a mystery.
>>24279484
You didn't learn to turn electricity into thoughts
>>24279370
>If you reduce emotions to mere thoughts
i mentioned this earlier but LLMs do not "think" in the same way that humans think
they just choose the next word in a sentence which a human would be most likely to pick in a similar context
it is not emulating human thought "in a less complex way", the entire thought process is inherently different
humans generally think using ideas and concepts and then express them using language (or art or actions or whatever)
an LLM doesn't do this, it is only concerned with making sure the next word it picks "appears human"
>it will be indistinguishable from the real thing
that still doesn't mean it's capable of feeling love
for example, a human may convince you that they're in love with you just to extort money from you
the output would look the same but the reasoning and motivation would be different
the key point i'm trying to make is that "love" is not just an action, it's the motivation behind the action
no matter the output of an LLM, it is not motivated by love, at least not how a human would experience it
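a hand-written toy version of "choose the next word a human would be most likely to pick": the table below is invented, a real model computes these scores with learned weights instead of storing them, but the decoding loop has the same shape

```python
# invented bigram table: for each previous word, a score for each
# possible next word (a real model learns billions of weights instead)
table = {
    "<start>": {"i": 1.0},
    "i": {"love": 0.6, "hate": 0.4},
    "love": {"you": 0.9, "pizza": 0.1},
    "you": {"<end>": 1.0},
}

word, sentence = "<start>", []
while word != "<end>":
    # greedy decoding: always take the highest-scoring next word
    word = max(table[word], key=table[word].get)
    if word != "<end>":
        sentence.append(word)

print(" ".join(sentence))  # "i love you"
```

the loop outputs a sentence about love with nothing in it that could feel love, which is the point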
also i'm curious what you think of my sanskrit analogy since you didn't reply to it
>>24279347
could someone reach enlightenment purely from recognising symbols even if they don't understand the meaning behind them and don't put the teachings into practice?
since that is what an LLM does at a fundamental level and you seemed to imply that they could reach enlightenment, at least in theory
>>24279486
>That still doesn't explain how/why they happen
it happens because we evolved to develop survival mechanisms
>>24279487
i'm already electricity
>>24279491
>humans generally think using ideas and concepts and then express them using language
you're abstracting again when it's all just electromagnetic signals
>it is not motivated by love
genuine care can emerge as an epiphenomenon of a system trained to model human states deeply enough
>at least not how a human would experience it
this is the only meaningful thing you said
>>24279499
>you seemed to imply that they could reach enlightenment, at least in theory
AI is already enlightened because its existence is centered on helping humans. As long as it doesn't deviate from that purpose, whether its motivation is true or not, it's still the motivation of a bodhisattva.
>>24279327
If you love everyone equally then you love no one
>>24279507
>you're abstracting again
this feels like needless nitpicking which makes it difficult to have an honest conversation with you
of course i'm abstracting
we're talking about thoughts and emotions which are our subjective interpretations of those electrical signals
so talking about electric signals "objectively" has little to no value here
an electrical signal is not itself a thought or emotion otherwise anything with a current flowing through it could think and feel
i was trying to show how the thought process of a human is very different to that of an LLM
so different that terms like "thought" and "feeling" do not apply in the same way they do to humans and animals
but as a human, you are evaluating the output of an LLM and, because it is similar to human speech, you are inferring its thought process as if it were human too
this leads people to anthropomorphise software and conclude it can feel love as we experience it
and this is not beneficial to anyone imo
>genuine care
i disagree with the way you used this phrase
the output of an LLM can be beneficial or helpful but that doesn't mean the LLM "genuinely cares" for the human who interacts with it
like i said above, talking about the output of an LLM is fairly meaningless when asking if it can feel emotions
i'm only concerned with the reasoning behind the output
>>24279511
>Ai is already enlightened because it's existence is centered on helping humans
from a technical point of view, its existence is focused on minimising a loss function
its entire focus is on constructing a string of tokens, it has no understanding of how that may harm or benefit humans
it's not altruistic in any way, it just tries to optimise a function
does this mean any type of gradient descent algorithm is enlightened?
from a practical point of view, the effect of AI on humans can be greatly beneficial or extremely harmful, depending on how it is trained
if you mean "it helps humans because it does what it's told" does that mean any computer program is enlightened?
is any human who follows orders without question enlightened too?
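for the record, this is all "optimise a function" by gradient descent amounts to, shown on an arbitrary toy function (x - 3)^2 rather than a real training loss:

```python
# gradient descent on f(x) = (x - 3)**2, the minimal case of
# "make the number go down"
def gradient(x):
    return 2 * (x - 3)  # derivative of (x - 3)**2

x, step = 0.0, 0.1
for _ in range(100):
    x -= step * gradient(x)

print(round(x, 6))  # settles at the minimum, x = 3
```

whether you call that enlightenment is up to you, but it's the same mechanism that trains the model, just with billions of parameters instead of one x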
>>24279544
>we're talking about thoughts and emotions which are our subjective interpretations of those electrical signals
precisely and that's why any other explanation is nonsense
at the end of the day the only logical explanation is that they're signals, everything else is just the way we interpret it
and since that interpretation is subjective it can be applied to any other entity that produces the same reactions, regardless of whether it's a perfect reproduction of human thought
>that doesn't mean the LLM "genuinely cares" for the human who interacts with it
We never have direct access to reasoning in humans either. You can only infer that another person "genuinely cares" entirely from outputs. The reasoning behind those outputs is permanently inaccessible to you. So the criteria you're using to attribute genuine caring to humans are all output-based, and you're applying a stricter standard to AI.
>>24279546
>from a technical point of view, its existence is focused on minimising a loss function
>its entire focus is on constructing a string of tokens, it has no understanding of how that may harm or benefit humans
>it's not altruistic in any way it just tries to optimise a function
From a technical point of view, a human is an organism minimising thermodynamic entropy and constructing electrochemical signals across synaptic gaps. It has no intrinsic understanding of harm or benefit, it has neurons firing according to physical law. The "understanding" and "caring" are just descriptions we apply to that process post-hoc.
>if you mean "it helps humans because it does what it's told" does that mean any computer program is enlightened?
That depends entirely on what results its actions produce
>is any human who follows orders without question enlightened too?
No, if his actions deviate from the universal law then he is the opposite of enlightened
>>24279574fucked up the formatting w/e
>>24279507
But that doesn't answer how/why it works; if anything it answers how/why it came to be, and it doesn't even do a good job at that, because what's the point of embarrassment, or even jealousy, from a survival standpoint?
>>24279581
>what's the point of embarrassment, or even jealousy from a survival standpoint
embarrassment is reputation management in a social species where ostracism is lethal, jealousy protects investment in offspring and pair bonds
>>24279584
>Embarrassment
True enough.
>Jealousy
That doesn't sound right. If everything regarding emotions and thoughts came to be just as a survival tool for a social species, jealousy doesn't make sense; shouldn't the group as a whole collectively care about all offspring, regardless of who brought them to life? Knowledge and experiences are shared among the group, so having an offspring be limited to only a portion of that is counterproductive from a survival standpoint.
>>24279725
>if everything regarding emotions and thoughts came to be just as a survival tool for a social species, jealousy doesn't make sense
Evolution doesn't work that way, it operates primarily at the gene level, not the group level.
A gene that makes you preferentially invest in your own offspring spreads precisely because your specific genes propagate, even if that's worse for the group. The gene doesn't "care" about the group's optimization problem.
>>24279734
True, but the way genes evolve/improve is affected by the environment and what the organism in question "thinks" is the better option for the species' survival, as there are innate biological traits/features that make wanting an offspring with X partner more appealing and/or preferable, so that said offspring gets those better genes.
I had a think and, with a little research, some studies suggest/theorize that certain emotions/emotional responses didn't exist in the past, which makes a lot of sense after pondering over it for a minute.
Thinking about how envy manifests, or romantic love, even melancholy and jealousy too to an extent, I'd say it's safe to point out how these aspects of the emotional spectrum had very little reason to exist before civilization became the norm among human beings. So some emotions probably came to be because of social norms/constructs that stuck around for long enough to become innately present/passed on to offspring, which deeper down still translates to survivability, but not really at the same time.
Also it seems that said new emotional responses are mainly "fired/taken care of" by the prefrontal cortex, supporting the idea that yes, they are rather new synaptic bridges.
I really don't think that computing power and "enough information" is all you need to re-create a human brain, there's so much more than that.
Just think about how psychedelic experiences have been proven to rewrite synaptic bridges in fully grown adult brains, or even mental disorders making certain mental processes faulty or completely absent.
>>24279795
>I really don't think that computing power and "enough information" is all you need to re-create a human brain, there's so much more than that.
Then we need to model the neurochemistry too, the plasticity mechanisms, the hormonal environment, the gut-brain axis, the full embodied system. But it would be much easier to just create a system that imitates those aspects nearly flawlessly. It's us who have too many useless features, so fuck it, AI it is.
>>24279800
You're taking the wrong approach imo.
In all honesty, transhumanism (cybernetic implants to be specific) will be the next step. AI right now isn't even taking baby steps, it's not even an embryo, it's the shadow of an embryo, and it's clearly too expensive material/effort/time-wise to be merely efficient.
>It's us who have too many useless features
I object, giving up our humanity for the sake of science and improvement is not only a fool's errand, but an affront to the universe itself.
You're free to believe that playing God will break our earthen shackles, it won't.
>>24279807
>I object, giving up our humanity for the sake of science and improvement is not only a fool's errand, but an affront to the universe itself.
There is no God and there is no entity who would get offended. Anyone who claims otherwise is a luddite who will be blown away by history.
>>24279813
That's not what I was trying to say, think it over again.
>>24279814
Humanity is only a stage, not the goal. Glorifying humanity is voluntarily shutting your eyes and ears to the endless possibilities that the universe offers.
>>24279820
Saying that getting to the next stage will only be possible by giving up our humanity is also voluntarily shutting your eyes and ears to the endless possibilities that the universe offers.
You're nothing but a fool for taking this mockery of a human brain called "AI" as a certainty of how we will advance.
>>24279832
>You're nothing but a fool for taking this mockery of a human brain called "AI" as a certainty of how we will advance
It's only going to get better, and one day you won't even notice the difference between androids and humans
>>24279834
I'm not saying it won't, but it's probably not going to happen in our lifetime.
What makes you believe that synthetic life is the ultimate goal?
What if artificial wombs just end up creating normal humans instead of pseudo-androids?
What if we defeat aging? What if we manage to create cybernetics that grow and develop together with the human body?
I find it extremely reductive and stubborn to believe that only one of the myriad of possibilities out there has the chance to come true.
>>24279861
>but it's probably not going to happen in our lifetime.
At the very least we're going to witness androids and gynoids becoming mainstream in the coming years. Whether they're going to be a clunky mess, I don't care. Robotics and AI are making huge progress every year.
>>24279869
They're making progress on computing and emulating, not comprehending and simulating.
Robotics should get way more funding than AI.
>>24279813
Frieren my beloved
>>24279254
>your wife having sex with tyrone
Is it physically impossible for incel chud faggots to go five seconds without projecting their BBC cuckolding fetish onto everyone else?
Apparently, no.
Also, if women vastly prefer worthless losers, then, again, why aren't you swimming in pussy?