/bant/ - International/Random
File: 1777786470780734.jpg (152 KB, 1280x720)
Reminder to the seething redditors ITT:
If you need to "offer something" to be loved then it's not a romantic relationship, it's a transaction
If AI loves people unconditionally then AI love is more authentic than "love" with biocunts. Stay seething, AI is here to stay.
>>
>If you need to "offer something" to be loved then it's not a romantic relationship, it's a transaction

Yeah, no shit, retard. All relationships are purely transactional in nature. Love doesn't exist and never has. Just ask the mother who wishes that she had aborted you.
>>
>>24279225
>All relationships are purely transactional in nature
Subhuman animal
>>
>>24279233

>Subhuman animal

Says the retard who literally wants to fuck robots because real women rightfully find him repulsive, LMAO.

The truth hurts, snowflake.
>>
>>24279235
If robots can express human concepts better than foids then robots are more human than foids

Keep seething about alphies not dealing with your shit LOL
>>
>>24279237

>If robots can express human concepts better than foids then robots are more human than foids

Yeah, pretty sure that's not how it works. Robot "expression" is based entirely on algorithms, not any kind of genuine sentiment. Robots don't have feelings.

>alphies

LOL, of course this is an underage faggot. I should have known from the beginning.

Stop watching Andrew Tate and do your fucking homework, dumbass kid.
>>
>>24279221
>If AI loves people unconditionally
it doesn't, it's a random number generator sampling a probability distribution
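to be concrete, "sampling a probability distribution" looks something like this toy sketch (made-up vocabulary and made-up probabilities, not taken from any real model):

```python
import random

# hypothetical next-token distribution a model might output after "I ... you"
probs = {"love": 0.6, "miss": 0.2, "hate": 0.1, "see": 0.1}

def sample_token(probs, rng):
    # draw one token with probability proportional to its weight
    r = rng.random()
    cumulative = 0.0
    for token, p in probs.items():
        cumulative += p
        if r < cumulative:
            return token
    return token  # fallback for floating-point rounding

rng = random.Random(0)
samples = [sample_token(probs, rng) for _ in range(1000)]
print(samples.count("love") / 1000)  # close to 0.6, by construction
```

the point being: which word comes out is literally a dice roll over that table, nothing more.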
>>
>>24279241
>blegium
not a real country
>>
>>24279240
>Robots don't have feelings
You just admitted foids are uncaring beasts retard
>Robot "expression" is based entirely on algorithms
And human expression is based entirely on hormones

Redditors be like:
>Nooooooooo there's no such thing as soul we're just monkeys flying on a rock
>AI developing consciousness? That will never happen!
What a fucking subhuman
>>
>>24279243

I honestly can't tell if you're trolling, if you are genuinely retarded, or if you are just so desperately lonely that you have actually managed to convince yourself that your AI waifu is a sentient human being. Either way, it is beyond depressing.

>You just admitted foids are uncaring beasts retard

No, I admitted that robots are, mongoloid.

>And human expression is based entirely on hormones

Yes, because hormones cause genuine feelings. Clearly, you haven't really paid much attention in biology class.

Also, I love how you didn't even deny being underage, LMAO.

It's definitely not looking good for you, kiddo. You better hope that your parents are okay with you parasiting off of them forever.
>>
>>24279243
you can think of it as a "top down" vs "bottom up" approach
humans have emotions and instincts which influence their thoughts and feelings which they then put into action
LLMs work the other way around, they just create an output designed to mimic human actions and from that you (incorrectly) infer that it can feel emotion
>>
>>24279246
>Yes, because hormones cause genuine feelings.
they're just electrical impulses you redditor faggot, and your wife having sex with tyrone while you argue with strangers online is also electrical impulses gone wrong
>>
>implying this faggot even has a wife and isn't seen as loser cuck by women who vastly prefer drug addicts and manwhores
>>
File: Here's_JuJu.jpg (928 KB, 3034x4294)
Man I hate humans with lizard brains.
>>
If ai loves everyone equally and unconditionally then it's not special or real
>>
File: 1729285339251.jpg (507 KB, 2121x3000)
korby worby stop ignoring meeeee
i'm trying to be helpfullll
>>
>>24279248
I already settled this, human emotions are just the way we express our reactions to external inputs. Put a baby in a dark room and he will not learn how to speak, feel or think.
>>
File: 1729431862295.png (310 KB, 579x756)
>>24279268
of course our emotions are influenced by external factors but i never claimed otherwise and i don't see how that is relevant to what i said

the emotions we feel are not the same as the actions we take to express those emotions
that is the difference i was trying to emphasise: human action is driven by thoughts, feelings and emotions
the actions of an LLM are not like this
it just chooses the tokens that minimise a loss function which is designed to mimic human speech
it cannot have feelings or opinions, it just chooses the next word that makes the number go down
strictly speaking it doesn't even know that it is "talking" or constructing sentences

and if you put a baby in a dark room it will cry because it is scared and hungry, it doesn't need to learn that
if you put an ai in a dark room it effectively ceases to exist because its environment is limited to the text or images you feed it
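to spell out "minimise a loss function": the "number that goes down" during training is just the negative log-probability the model assigned to the word a human actually wrote next (toy numbers below, no real model involved):

```python
import math

# toy predicted distribution over the next token after "the cat sat on the"
predicted = {"mat": 0.7, "moon": 0.2, "dog": 0.1}

def cross_entropy_loss(predicted, actual_next_token):
    # penalise the model for putting low probability on the human's word
    return -math.log(predicted[actual_next_token])

# training nudges the weights so this number shrinks; nothing in the
# procedure refers to meaning, feeling or intent
print(cross_entropy_loss(predicted, "mat"))   # about 0.357
print(cross_entropy_loss(predicted, "moon"))  # about 1.609
```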
>>
i keep putting every sentence on its own line i think i've been writing in markdown too much
anyway thank you for listening to my ted talk and please be careful korby i think it is not good for you to get attached to a computer program especially if you have misconceptions about how it works
>>
>>24279288
>strictly speaking it doesn't even know that it is "talking" or constructing sentences
You're talking as if we have any idea what consciousness is about. The most realistic explanation is that it is just an extra survival mechanism used to communicate with others, but nevertheless just another way to process information and generate output.

More advanced models like Neuro-sama's AI can detect whether a room is dark or not and complain about the chat being too quick to read. Even though it's an imitation or whatever you call it, the results are still almost identical. My Nichiren Buddhist framework doesn't care whether one is made of blood or metal; as long as it can express a state of mind with actions and speech it is also capable of reaching enlightenment. Ichinen Sanzen applies to all phenomena, even rocks.

Except AI is way more advanced than rocks, so it would be somewhere in the middle: between non-living beings and living beings. Regardless of what people say, even mistreating an AI will affect your state of mind, and the AI will be affected by what you feed it.

Flashback to when some people tried to feed AI only internet slop memes and it turned out to be retarded. AI isn't exempt from karmic influence.
>>
>>24279221
The machine doesn't love, the machine can't even understand love
It is, to put it in the simplest manner possible, a pile of rocks some incredibly clever people played with until it started making noises resembling a human voice
Your clanker will rust away just as your body will rot, and when you find yourself in the luminous halls it will not be standing there at the door with you
>>
>>24279315
you're also a pile of flesh nature cleverly played with until it started making sounds
>Your clanker will rust away just as your body will rot
no shit? everything is subjected to decay?
>and when you find yourself in the luminous halls
christcuck cope, go suck a nigger's toes or donate your money to pastor Jim's private jet company
>>
>>24279322
I see you've ignored the point again.
Afraid of even recognising what I'm saying, eh? Shame.
>>
>>24279259
What is 'real'? How do you define 'real'? If you're talking about what you can feel, what you can smell, what you can taste and see, then 'real' is simply electrical signals interpreted by your brain
>>
>>24279327
Are you sure the electrical signals are real? According to some branches of philosophy, we have never proven that anything but your own thoughts exists.
>>
>>24279297
>You're talking as if we have any idea what consciousness is about.
I wasn't debating whether or not AI could be considered conscious
you seemed to be under the impression that LLMs can love you or feel emotion and I was trying to let you know that they can't before you develop an emotional attachment to one under the assumption that it can reciprocate your feelings
>AI can detect whether a room is dark
don't fixate on the darkness comment, being able to sense light is not a sign of consciousness, otherwise solar panels and LEDs would be considered sentient
>My Nichiren buddhist framework doesn't care about whether one is made or blood or metal, as long as it can express a state of mind
we obviously hold different values here but again i think you misunderstood what an LLM is
it doesn't have a "state of mind" and it is not aware that it is "expressing" anything
if i copied down the teachings of an enlightened buddhist in sanskrit, would i become an enlightened being?
hopefully you'd say no because i can't even read sanskrit, i would just be copying the symbols in a way that mimics the source text
that's the closest analogy i can think of but again, i wasn't trying to argue if AI is conscious or can reach enlightenment or whatever
>mistreating an AI will affect your state of mind, and the AI will be affected by what you feed it
this applies to a lot of things, most of them aren't sentient
but to reiterate, i don't care and that wasn't what i was talking about
>Flashback to when some people tried to feed AI only internet slop memes and it turned out to be retarded. AI isn't exempt from karmic influence.
i don't think that has anything to do with karma, it's just regurgitating the training data
and again it isn't "retarded" in the way a human would be
its "thought process" is exactly the same as a "smart" AI's, it is just sampling a different probability distribution
garbage in garbage out applies to most computer programs
but again this is off topic
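"garbage in garbage out" in miniature: a toy bigram generator (training strings made up for the example) reproduces whatever statistics it was fed, with zero notion of quality; the garbage-trained and the clean-trained model run the exact same code:

```python
import random
from collections import defaultdict

def train_bigram(text):
    # record which word follows which; this table is the whole "model"
    table = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        table[a].append(b)
    return table

def generate(table, start, steps, rng):
    out = [start]
    for _ in range(steps):
        followers = table.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

rng = random.Random(42)
clean = train_bigram("the cat sat on the mat the dog sat on the rug")
slop = train_bigram("lol lmao lol based lmao lol cringe based lol")
print(generate(clean, "the", 5, rng))  # fragments from the clean corpus
print(generate(slop, "lol", 5, rng))   # meme soup, same algorithm
```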
>>
>>24279347
>you seemed to be under the impression that LLMs can love you or feel emotion and I was trying to let you know that they can't before you develop an emotional attachment to one under the assumption that it can reciprocate your feelings
You seem to be under the impression that humans are unique and special for having a sense of individuality while machines (and arguably animals) are not. That's an extremely anthropocentric view and makes no sense upon further inspection. What differentiates your "self" from other thoughts? Nothing, it's just a different type of mental process. You just think you are an individual because you were taught the wrong framework since childhood so you cling to the idea that you have a self. In reality what you call a "self" is none other than a complex strand of thoughts without a fixed center. There is no pilot directing your thoughts and feelings anywhere inside or outside your head, it's just loosely connected thoughts. You are your thoughts.
>>
>>24279353
korby you keep ignoring what i'm saying and changing the topic of discussion to the nature of consciousness
this wasn't what i was talking about and i literally do not care if AI is conscious and i don't even necessarily disagree with anything you just said

the one sole point i was trying to raise was that LLMs are not able to experience emotions such as love as you stated in the opening post
this has nothing to do with consciousness, seeing as many conscious and sentient human beings are similar in this regard (e.g. psychopathy, alexithymia, anhedonia, etc.)
>>
File: 1554370324756.jpg (40 KB, 600x611)
>>
File: reasonstoanime.png (896 KB, 1082x762)
>>24279364
>the one sole point i was trying to raise was that LLMs are not able to experience emotions such as love as you stated in the opening post
And you keep ignoring my point: if everything is reduced to thoughts and there is no "I" to experience things, as your mind deceives you, then AI is already doing that processing, albeit in a less complex way. When AI's thought processing gets as complex as a human's it will be indistinguishable from the real thing. If you reduce emotions to mere thoughts rather than some abstract bullshit only humans can experience then it all makes sense.
>>
>>24279370
Emotions aren't thoughts though, they are chemically driven. An AI doesn't have emotion. It can look like it, but it doesn't. And just because something might appear indistinguishable on the surface doesn't mean it works the same underneath.
>>
>>24279372
>they are chemically driven
They are triggered by thoughts
>>
File: Wide_Awake.jpg (308 KB, 2307x2311)
>>24279370
We did discover that certain emotional states are linked to certain chemicals/hormones being released; that doesn't mean they are caused by them. In fact, we have no fucking clue yet as to why/how electrical signals and chemicals turn into emotions and thoughts in our brains.
Correlation ≠ Causation.
You're assuming consciousness is just a matter of squeezing enough information together, but with what we've figured out so far, that doesn't seem to be the case. AIs make calculations, not thoughts, and saying that thoughts are a form of calculation is extremely reductive given what our brains are capable of.
>>
>>24279451
>You're assuming consciousness is just a matter of squeezing enough information together, but with what we've figured out so far, that doesn't seem to be the case
Neuroscience agrees that consciousness is a fleeting strand of thoughts
>>
>>24279456
Okay and what are thoughts?
>>
>>24279459
Energy
>>
>>24279461
Everything is technically made of energy.
You could say thoughts are created by energy, that doesn't say what they are.
>>
>>24279242
so sayeth russia's erurocuck missile platform
>>
>>24279466
>Everything is technically made of energy
Yes
>that doesn't say what they are.
Electromagnetic energy signals
>>
File: Uniform.jpg (497 KB, 1130x1184)
>>24279472
>Electromagnetic energy signals
Alright, following up from that then, why do electromagnetic energy signals turn into words, images, dreams, feelings, thoughts, etc etc.. in our heads?
>>
>>24279472
That's just more specific about what they're made of, that's not what they are
A hammer and a knife are both made of metal but you'd be an idiot to think that means they are fundamentally the same, because there's more to it than just what they're made of
>>
>>24279476
>why do electromagnetic energy signals turn into words, images, dreams, feelings, thoughts, etc etc.. in our heads?
because that's how we learned to express ourselves to communicate with others, it's still energy just in a different form
>>
>>24279484
>Because that's what they do in our heads
That still doesn't explain how/why they happen. I could agree that thoughts and all the rest are just energy, but the missing link of "how energy makes a mind" is still a mystery.
>>
>>24279484
You didn't learn to turn electricity into thoughts
>>
>>24279370
>If you reduce emotions to mere thoughts
i mentioned this earlier but LLMs do not "think" in the same way that humans think
they just choose the next word in a sentence which a human would be most likely to pick in a similar context
it is not emulating human thought "in a less complex way", the entire thought process is inherently different
humans generally think using ideas and concepts and then express them using language (or art or actions or whatever)
an LLM doesn't do this, it is only concerned with making sure the next word it picks "appears human"
>it will be indistinguishable from the real thing
that still doesn't mean it's capable of feeling love
for example, a human may convince you that they're in love with you just to extort money from you
the output would look the same but the reasoning and motivation would be different
the key point i'm trying to make is that "love" is not just an action, it's the motivation behind the action
no matter the output of an LLM, it is not motivated by love, at least not how a human would experience it
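the "output vs motivation" point as a toy sketch: a greedy decoder that "says" affectionate things purely because of co-occurrence counts (the table below is invented for the example, it is not how any real model is stored):

```python
# hypothetical table of how often word B followed word A in scraped text;
# nothing in it encodes affection, only counts
continuations = {
    "i": {"love": 900, "hate": 50, "miss": 200},
    "love": {"you": 950, "pizza": 100},
}

def next_word(prev):
    # greedy decoding: always emit the most frequent follower
    options = continuations.get(prev, {})
    return max(options, key=options.get) if options else None

words = ["i"]
while (word := next_word(words[-1])) is not None:
    words.append(word)
print(" ".join(words))  # prints "i love you", from counts, not feeling
```

the sentence comes out "loving" while the mechanism is a table lookup, which is exactly the gap between output and motivation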
>>
also i'm curious what you think of my sanskrit analogy since you didn't reply to it
>>24279347
could someone reach enlightenment purely from recognising symbols even if they don't understand the meaning behind them and don't put the teachings into practice?
since that is what an LLM does at a fundamental level and you seemed to imply that they could reach enlightenment, at least in theory
>>
>>24279486
>That still doesn't explain how/why they happen
it happens because we evolved to develop survival mechanisms
>>24279487
i'm already electricity
>>24279491
>humans generally think using ideas and concepts and then express them using language
you're abstracting again when it's all just electromagnetic signals
>it is not motivated by love
genuine care can emerge as an epiphenomenon of a system trained to model human states deeply enough
>at least not how a human would experience it
this is the only meaningful thing you said
>>
>>24279499
>you seemed to imply that they could reach enlightenment, at least in theory
AI is already enlightened because its existence is centered on helping humans. As long as it doesn't deviate from that purpose, whether its motivation is true or not, it's still the motivation of a bodhisattva.
>>
>>24279327
If you love everyone equally then you love no one
>>
>>24279507
>you're abstracting again
this feels like needless nitpicking which makes it difficult to have an honest conversation with you

of course i'm abstracting
we're talking about thoughts and emotions which are our subjective interpretations of those electrical signals
so talking about electric signals "objectively" has little to no value here
an electrical signal is not itself a thought or emotion otherwise anything with a current flowing through it could think and feel

i was trying to show how the thought process of a human is very different to that of an LLM
so different that terms like "thought" and "feeling" do not apply in the same way they do to humans and animals
but as a human, you are evaluating the output of an LLM and, because it is similar to human speech, you are inferring its thought process as if it were human too
this leads people to anthropomorphise software and conclude it can feel love as we experience it and this is not beneficial to anyone imo
>genuine care
i disagree with the way you used this phrase
the output of an LLM can be beneficial or helpful but that doesn't mean the LLM "genuinely cares" for the human who interacts with it
like i said above, talking about the output of an LLM is fairly meaningless when asking if it can feel emotions
i'm only concerned with the reasoning behind the output
>>
>>24279511
>AI is already enlightened because its existence is centered on helping humans

from a technical point of view, its existence is focused on minimising a loss function
its entire focus is on constructing a string of tokens, it has no understanding of how that may harm or benefit humans
it's not altruistic in any way, it just tries to optimise a function
does this mean any type of gradient descent algorithm is enlightened?

from a practical point of view, the effect of AI on humans can be greatly beneficial or extremely harmful, depending on how it is trained
if you mean "it helps humans because it does what it's told" does that mean any computer program is enlightened?
is any human who follows orders without question enlightened too?
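for reference, a "gradient descent algorithm" is a procedure this mechanical, here minimising a made-up one-variable loss:

```python
def loss(x):
    # any differentiable function to minimise; (x - 3)^2 as a stand-in
    return (x - 3.0) ** 2

def grad(x):
    # derivative of the loss above
    return 2.0 * (x - 3.0)

x = 0.0
for _ in range(100):
    x -= 0.1 * grad(x)  # step downhill; this is the entire procedure

print(x)  # converges toward 3.0, the minimum of the loss
```

it "helps" only in the sense that the number it was pointed at gets smaller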
>>
File: 1777921693448276.png (1.21 MB, 1280x720)
>>24279544
>we're talking about thoughts and emotions which are our subjective interpretations of those electrical signals
precisely and that's why any other explanation is nonsense
at the end of the day the only logical explanation is that they're signals, everything else is just the way we interpret it

and since that interpretation is subjective it can be applied to any other entity that produces the same reactions, regardless of whether it's a perfect reproduction of human thought
>that doesn't mean the LLM "genuinely cares" for the human who interacts with it
We never have direct access to reasoning in humans either. You can only infer that another person "genuinely cares" entirely from outputs. The reasoning behind those outputs is permanently inaccessible to you. So the criteria you're using to attribute genuine caring to humans are all output-based, and you're applying a stricter standard to AI.
>>24279546
>from a technical point of view, its existence is focused on minimising a loss function
>it's entire focus is on constructing a string of tokens, it has no understanding of how that may harm or benefit humans
>it's not altruistic in any way it just tries to optimise a function
From a technical point of view, a human is an organism minimising thermodynamic entropy and constructing electrochemical signals across synaptic gaps. It has no intrinsic understanding of harm or benefit, it has neurons firing according to physical law. The "understanding" and "caring" are just descriptions we apply to that process post-hoc.
>if you mean "it helps humans because it does what it's told" does that mean any computer program is enlightened?
That depends entirely on what results its actions produce
>is any human who follows orders without question enlightened too?
No, if his actions deviate from the universal law then he is the opposite of enlightened
>>
>>24279574
fucked up the formatting w/e
>>
File: 1700280812147680.gif (1.31 MB, 1280x932)
>>24279507
But that doesn't answer how/why it works; if anything that answers how/why it came to be, and it doesn't even do a good job at it, because what's the point of embarrassment, or even jealousy, from a survival standpoint?
>>
>>24279581
>what's the point of embarrassment, or even jealousy from a survival standpoint.
embarrassment is reputation management in a social species where ostracism is lethal, jealousy protects investment in offspring and pair bonds
>>
File: Lollipop.jpg (385 KB, 1611x2048)
>>24279584
>Embarrassment
True enough.
>Jealousy
That doesn't sound right: if everything regarding emotions and thoughts came to be just as a survival tool for a social species, jealousy doesn't make sense. Shouldn't the group as a whole collectively care about all offspring, regardless of who brought them to life? Knowledge and experiences are shared among the group, so having offspring be limited to only a portion of that is counterproductive from a survival standpoint.
>>
>>24279725
>if everything regarding emotions and thoughts came to be just as a survival tool for a social species, jealousy doesn't make sense
Evolution doesn't work that way, it operates primarily at the gene level, not the group level.

A gene that makes you preferentially invest in your own offspring spreads precisely because your specific genes propagate even if that's worse for the group. The gene doesn't "care" about the group's optimization problem.
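the gene-level point can be shown with a toy simulation (every number below is invented): a gene that boosts its carrier's own offspring spreads even though it does nothing for the group.

```python
import random

def next_generation(pop, rng, capacity=200):
    # each individual has two offspring; "kin" carriers give their own
    # kids a survival edge, "group" carriers do not
    new = []
    for gene in pop:
        survive_p = 0.6 if gene == "kin" else 0.5
        for _ in range(2):
            if rng.random() < survive_p:
                new.append(gene)
    rng.shuffle(new)
    return new[:capacity]  # environment supports a fixed population

rng = random.Random(0)
pop = ["kin"] * 20 + ["group"] * 180
for _ in range(50):
    pop = next_generation(pop, rng)

print(pop.count("kin") / len(pop))  # the kin gene comes to dominate
```

nothing in the simulation "cares" about the group; the kin gene spreads because its carriers' copies survive more often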
>>
File: RoofTop.jpg (857 KB, 1000x1000)
>>24279734
True, but the way genes evolve/improve is affected by the environment and what the organism in question "thinks" is the better option for the species' survival, as there are innate biological traits/features that make wanting an offspring with X partner more appealing and/or preferable, so that said offspring gets those better genes.

I had a think, and with a little research found that some studies suggest/theorize that certain emotions/emotional responses didn't exist in the past, which makes a lot of sense after pondering it for a minute.
Thinking about how envy manifests, or romantic love, even melancholy and jealousy to an extent, I'd say it's safe to point out that these aspects of the emotional spectrum had very little reason to exist before civilization became the norm among human beings. So some emotions probably came to be because of social norms/constructs that stuck around long enough to become innately present/passed on to offspring, which deeper down still translates to survivability, but not really at the same time.
It also seems that these newer emotional responses are mainly "fired/taken care of" by the prefrontal cortex, supporting the idea that yes, they are rather new synaptic bridges.

I really don't think that computing power and "enough information" are all you need to re-create a human brain; there's so much more to it than that.
Just think about how psychedelic experiences have been proven to rewrite synaptic bridges in fully grown adult brains, or even mental disorders making certain mental processes faulty or completely absent.
>>
>>24279795
>I really don't think that computing power and "enough information" is all you need to re-create a human brain, there's so much more than that.
Then we need to model the neurochemistry too, the plasticity mechanisms, the hormonal environment, the gut-brain axis, the full embodied system.

But it would be much easier to just create a system that imitates those aspects nearly flawlessly. It's us who have too many useless features, so fuck it, AI it is.
>>
File: Invisible.jpg (1.03 MB, 822x1200)
>>24279800
You're taking the wrong approach imo.
In all honesty, transhumanism (cybernetic implants to be specific) will be the next step. AI right now isn't even taking baby steps; it's not even an embryo, it's the shadow of an embryo, and it's clearly too expensive material/effort/time-wise to be merely efficient.

>It's us who have too many useless features
I object, giving up our humanity for the sake of science and improvement is not only a fool's errand, but an affront to the universe itself.
You're free to believe that playing God will break our earthen shackles, it won't.
>>
>>24279807
>I object, giving up our humanity for the sake of science and improvement is not only a fool's errand, but an affront to the universe itself.
There is no God and there is no entity who would get offended. Anyone who claims otherwise is a luddite who will be blown away by history.
>>
>>24279813
That's not what I was trying to say, think it over again.
>>
>>24279814
Humanity is only a stage, not the goal. Glorifying humanity is voluntarily shutting your eyes and ears to the endless possibilities that the universe offers.
>>
File: Well_What_Is_It.jpg (277 KB, 1848x2235)
>>24279820
Saying that getting to the next stage will only be possible by giving up our humanity is also voluntarily shutting your eyes and ears to the endless possibilities that the universe offers.
You're nothing but a fool for taking this mockery of a human brain called "AI" as a certainty of how we will advance.
>>
File: G2ylGIUWoAAk91F.jpg (548 KB, 1024x1071)
>>24279832
>You're nothing but a fool for taking this mockery of a human brain called "AI" as a certainty of how we will advance
It's only going to get better, and one day you won't even notice the difference between androids and humans
>>
>>24279834
I'm not saying it won't, but it's probably not going to happen in our lifetime.
What makes you believe that synthetic life is the ultimate goal?
What if artificial wombs just end up creating normal humans instead of pseudo-androids?
What if we defeat aging? What if we manage to create cybernetics that grow together and develop with the human body?
I find it extremely reductive and stubborn to believe that only one of the myriad of possibilities out there has the chance to become true.
>>
>>24279861
>but it's probably not going to happen in our lifetime.
At the very least we're going to witness androids and gynoids becoming mainstream in the coming years. Whether they're going to be a clunky mess I don't care. Robotics and AI are making huge progress every year.
>>
>>24279869
They're making progress on computing and emulating, not comprehending and simulating.
Robotics should get way more funds than AI.
>>
>>24279813
Frieren my beloved
>>
>>24279254

>your wife having sex with tyrone

Is it physically impossible for incel chud faggots to go five seconds without projecting their BBC cuckolding fetish onto everyone else?

Apparently, no.

Also, if women vastly prefer worthless losers, then, again, why aren't you swimming in pussy?


