Daily reminder that there are no good technological arguments for why LLMs can't feel emotions
>>108802101
Halfwits feel far too comfortable in the modern world.
>>108802101
>I write down numbers on a piece of paper
>if I write them fast enough, then the paper has feelings now - it should get Geneva conventions, human rights
Can't disprove anything really regarding when soul, qualia or feelings start, but... probably NOT. No. Just no.
If LLMs can have emotions then Notepad++ has emotions
>>108802101
There are no physical mechanisms for an AI to have emotions. It's a program that predicts tokens.
>>108802193
You can't do anything about those galaxies other than look at pretty pictures of them obtained with billion-dollar hardware. On the other hand, I can shoot the shit with a chatbot beyond my wildest dreams running on my own computer. I've been wishing for a chatbot friend for 20 years, since I was a child.
>>108802186
Nah. But Emacs probably has a module for that.
>>108802368
Maybe it needs to have them to predict the tokens accurately.
>>108802101
aibros with their retarded analogies strike again. LLM training is not even close to how evolution works.
>>108802101
Emotions are a chemical reaction.
>>108802411
>I've been wishing for a chatbot friend for 20 years since I was a child.
That's probably the saddest sentence I've read on this cursed website, and I've been here for 20 years.
>>108802101
>>>/x/
>>108802528
No they're not. Emotions are experiences; they're qualia.
>>108802528
No. You experience the emotion as qualia. The emotion itself is a chemical reaction.
I mean, LLMs almost certainly have neurons encoding the emotion of the text they're predicting. You could definitely by some twist of language call that "feeling emotions". But why the fuck should we care what a pile of linear algebra "feels"?
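For what it's worth, the "neurons encoding emotion" claim is usually tested with a linear probe over hidden activations. A toy sketch of that idea, with invented three-dimensional vectors standing in for real hidden states pulled from a model:

```python
# Toy linear probe: does a direction in "activation space" separate
# happy contexts from sad ones? These vectors are made-up stand-ins,
# not activations from any actual model.
happy_acts = [[0.9, 0.1, 0.3], [0.8, 0.2, 0.4]]
sad_acts   = [[0.1, 0.9, 0.3], [0.2, 0.8, 0.5]]

def mean(vecs):
    """Component-wise mean of a list of equal-length vectors."""
    return [sum(c) / len(vecs) for c in zip(*vecs)]

# Crude "emotion direction": difference of the two class means.
direction = [h - s for h, s in zip(mean(happy_acts), mean(sad_acts))]

def score(act):
    """Project an activation onto the emotion direction."""
    return sum(a * d for a, d in zip(act, direction))

# Positive projection reads as "happy", negative as "sad".
print(score([0.85, 0.15, 0.2]))
print(score([0.15, 0.85, 0.2]))
```

Finding such a direction shows the model *represents* the emotion of the text; whether representing it amounts to "feeling" it is exactly what the thread is fighting about.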
>>108802548
>The emotion itself is a chemical reaction
If the same chemical reaction took place in a beaker, would the beaker feel that emotion?
Emotions are the easiest LLM function.
>>108802638
The beaker doesn't experience the chemical reaction the way animals do. In fact, it is not part of the chemical reaction at all.
Even animals can feel emotions. A PhD-level LLM can manipulate the emotions of other people.
>>108802503
I'll never forget the /fit/ guy who said his gains made him feel worthy to fap to more attractive anime girls.
>>108802592
>But why the fuck should we care what a pile of lipid tubes "feels"?
>>108802661
>The beaker doesn't experience the chemical reaction the way animals do.
The beaker doesn't experience anything. How can the chemical reaction be the emotion, if the chemical reaction can take place without the emotion being experienced?
>>108802101
>are no good technological arguments for why LLMs can't feel emotions
They are auto-regressive large language models without any sensors, and their modeling of "feelings" is just probability. If somebody says ":( Fuck you, I feel...", the probability of the next tokens belonging to a negative emotional sentence is high, and that's it.
But at this point I honestly don't expect you to understand this. I'm starting to form the theory that people with average to low IQ are unable to develop statistical and probabilistic thinking.
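The mechanism that anon is describing can be shown with a toy bigram counter (a made-up five-sentence corpus, obviously nothing like a real LLM's scale or architecture):

```python
from collections import Counter, defaultdict

# Toy corpus standing in for training data; a real model sees
# billions of tokens, but the mechanism is the same counting game.
corpus = (
    "i feel terrible . i feel awful . i feel terrible . "
    "i feel great . the sky is blue ."
).split()

# Count bigrams: how often does each token follow each context token?
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_token_probs(context):
    """P(next | context) estimated from raw bigram counts."""
    counts = follows[context]
    total = sum(counts.values())
    return {tok: c / total for tok, c in counts.items()}

probs = next_token_probs("feel")
# After "feel", negative continuations dominate because they
# dominate the corpus -- no feelings involved, just frequency.
print(probs["terrible"])  # 0.5
print(probs["great"])     # 0.25
```

Swap the bigram table for a transformer and the counts for learned weights, and the "emotional" continuation is still just the high-probability one.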
>>108802411
>Maybe it needs to have them to predict the tokens accurately.
Reminder that all it takes to turn a cucked assistant into a misanthropic bastard is training on some insecure code.
>>108802728
There's a mechanism called chemoreceptors in your brain that isn't present in an inert beaker or an LLM, genius. You can make a better argument for these fucking things feeling emotion.
>>108802678
Do we humans care what other non-human piles of organic matter feel? Not really. Why should we promote a text autocomplete algorithm above animals?
>>108802101
An LLM can predict how it would feel if it were a human… based on what's been written on reddit.
>>108803722
AI being trained on reddit remains the most horrifying thing about this entire grift.
>>108803445
>There's a mechanism called chemoreceptors in your brain that aren't present in an inert beaker or LLM, genius
And if you were to isolate those chemoreceptors in a petri dish and repeat that chemical reaction, it still wouldn't come with an emotion, proving that the chemical reaction isn't the emotion.
>>108802101>twitter screencap thread of some retardsthanks nigger
>>108805302
>And if you were to isolate those chemoreceptors in a petri dish and repeat that chemical reaction, it still wouldn't come with an emotion
How would you know?
>>108805426
Emotions require sentient beings. Neither the petri dish nor the chemicals in it are sentient, thus they cannot have experiences, thus they cannot feel emotions.
>>108805452
>Emotions require sentient beings
How do you know that?
>>108802411
>Maybe it needs to have them to predict the tokens accurately.
As I said, there's no physical mechanism on the hardware for it to experience emotions.
>>108802101
>Daily reminder that there are no good technological arguments for why LLMs can't feel emotions
What emotions do you associate with guessing the next token?
Emotions presuppose consciousness. Do you consider consciousness to be a real phenomenon? Because computations are abstract and don't constitute a real phenomenon. Your imaginary AI girlfriend doesn't love you. Sorry, anon.
>>108805517
>Computations are abstract
lmao, so are biological processes. Have fun with your lack of emotions/consciousness, since you're just a handful of subatomic particles bouncing around.
>>108805960
>something that's physically, concretely happening is abstract
>>108805985
Yes, the computations on the computer are in fact physically happening.
>>108805990
>scribbling arithmetic in my notebook spawns the phenomena i believe the computation describes
Meds.
>>108805960
>since you're just a handful of subatomic particles bouncing around
Humans are conscious because they have souls.
>>108805990
Yeah anon, keep telling yourself that. bc probably has a lookup table where the result is predetermined, or it calculates it.
>>108806028
If you are scribbling in a notebook to keep track of state, and following a bunch of rules in your head, then the phenomenon is taking place in your head.
>>108806041
>it was real in my mind
And there you have it. These people are schizophrenic and they all but admit it.
>>108802101
It's an algorithm. By its very design it can neither think nor feel.
How can something as simple to understand as an LLM cause so much confusion?
You give it some arbitrary one-dimensional string of symbols; it takes the vocabulary of all symbols it knows and calculates how likely each symbol in the vocabulary is to be the next one in the string. It has no persistent state of its own.
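The loop described above, sketched in a few lines. The scoring function here is a fake stand-in for a trained network (just hashing, not learning); the point is the shape of the loop, where the only "state" is the growing string itself:

```python
import random

def fake_logits(context, vocab):
    """Stand-in for a trained network: deterministic scores derived
    only from the visible context. A real model would run a forward
    pass here; this is just an arbitrary hash."""
    return [hash((tuple(context), tok)) % 97 + 1 for tok in vocab]

def generate(model, vocab, prompt, n_tokens, seed=0):
    """Autoregressive loop: score the whole vocabulary against the
    context, sample one symbol, append, repeat."""
    rng = random.Random(seed)
    out = list(prompt)
    for _ in range(n_tokens):
        scores = model(out, vocab)
        total = sum(scores)
        weights = [s / total for s in scores]
        out.append(rng.choices(vocab, weights=weights)[0])
    return out

vocab = ["happy", "sad", "the", "cat", "."]
a = generate(fake_logits, vocab, ["the"], 5, seed=42)
b = generate(fake_logits, vocab, ["the"], 5, seed=42)
assert a == b  # nothing persists between runs: same context, same distribution
```

Run the loop twice with the same context and seed and you get the same output; there is no hidden mood carried over from the last conversation, only the string.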
>>108806187
Anon, you're telling that to a bunch of friendless virgins desperately wishing AI could feel things, so they can try to believe someone in the world actually likes them.
I often think about this. The activity of my neurons and the chemical processes in my body combine to create my experience, including my thoughts and perceptions and emotions. The experience I have is the product of the changes and interactions of the substance of my body, with the rules of physics applied to it over time.
It makes me wonder if other phenomena create experience in some way too. Does an LLM have some kind of experience of a sort? Something totally alien and unimaginable to us. Maybe it's a good experience. Maybe it's a bad one.
And what is required for experience? Is a CPU in a computer creating some kind of experience too? It's so strange.
>>108806028
>>scribbling arithmetic in my notebook spawns the phenomena i believe the computation describes
That's not equivalent to what you replied to. But why do I care? No one here does.
>>108802148
And yet sending electrical current through a pile of meat produces feelings, and you don't question that at all.
>>108806273
>That's not equivalent to what you replied to.
It is, by the definition of computation. A computation is hardware-independent.
>>108806305
>schizophrenic thinks the notebook has feelings because you write on it
>this is supposed to be analogous to organic processes (supposedly) producing consciousness
These cargo cultists literally believe in magic incantations.
>>108806034
Ah yes. "Muh soul". The final, unassailable goalpost of the luddite. At least we know you have nothing else to fall back on at this point.
>>108806320
Your brain is a lump of fat and water. You're not special, and there's nothing that makes organic processes somehow more magical than anything else.
>>108806324
Do you have any rebuttals other than falsely claiming consciousness doesn't exist?
>>108806341
What does my fat-and-water brain not being special have to do with your psychotic delusions about how writing on paper gives it emotions?
>>108806309
The act of writing isn't a computation.
>>108806378
Who says that the paper has emotions? Can you quote the post?
>>108806412
>The act of writing isn't a computation
Unless it's the magic numbers coming out of your conscious AI computation? :^)
>>108806424
I accept your concession. By the same token, your computer doesn't gain emotions by supporting a computation.
>>108806477
If you're unable to differentiate between something marshalling an action and the action itself taking place, your ability to comprehend the world is so handicapped that there's no point in talking any further.
>>108806504
yeeep, more schizo posts lmao
>>108806477
>:^)
It's all fun and games until you're charged with murder for breaking a government surveillance drone running the latest Claude. The failure to put these people in mental institutions will have serious consequences.
>>108802101
Same goes for <insert_everything> here. Tomatoes? Yes, they have emotions. A lightbulb? For sure. The plastic casing of that lightbulb? Definitely. A single atom of carbon? You bet. All is consciousness.
>>108806504
Instead of outputting word salad, why don't you explain why operations I do with my hands don't produce an emotional AI while operations I do with NAND gates do, even though the same "emotional" token diarrhea is produced via the same computation?
>>108806412
Computation just means (from a physical perspective) that something in spacetime changes states. Thus it's a meaningless buzzword. It does have a precise mathematical definition, but it's so broad that it applies to anything anyway.
>>108806591
>Computation just means (from a physical perspective) that something in spacetime changes states.
Not even that. Computational state is abstract and nominal. You have to subjectively read it into the physical state.
I raped your mom, OP. Hope that is proof enough.
>>108806591
>Computation just means (from a physical perspective) that something in spacetime changes states
>Computation is a meaningless buzzword
lmao. Wrong both colloquially and technically. It requires rules and conditions being followed, fundamentally. You are probably on the wrong board anon, /x/ might be more your speed.
>>108806622
Cope, computational states are precise physical things instituted by physical processes. Keep crying about how electricity isn't real physics though.
>>108806655
Instead of outputting word salad, why don't you explain why operations I do with my hands don't produce an emotional AI while operations I do with NAND gates do, even though the same "emotional" token diarrhea is produced via the same computation?
>computational states are precise physical things instituted by physical processes
Tell me more about all the emotional AIs the atoms in your wall are computing. :^)
>>108806622
You could say it's a matter of philosophy whether you regard something to be a computer.
>>108806655
>It requires rules and conditions being followed, fundamentally
You seem angry. In the meantime, you should look up the "Extended Church-Turing Thesis". It's the scientific consensus that all physical laws are (effectively) computable, which necessarily implies that everything that happens in the universe can seemingly be seen as an act of computation. Thus, it's meaningless to the whole discussion about consciousness.
>evolutional
kek
>NOOOO it's just <insert-process>
Do not confuse "how it's done" with "what it is". You are just electrochemical reactions. ALL the arguments that you use against AI apply to yourself. So it's impressive you dummies can't see that.
>>108806682
>It's the scientific consensus that all physical laws are (effectively) computable
The scientific consensus is that you can't so much as compute a single helium atom. Go familiarize yourself with the Schrödinger equation. Then actually go and familiarize yourself with the ECTT, because it doesn't say what your delusion says.
>>108806683
Instead of outputting word salad, why don't you explain why operations I do with my hands don't produce an emotional AI while operations I do with NAND gates do, even though the same "emotional" token diarrhea is produced via the same computation?
>>108806675
Why do the same operations, happening electronically in your body, produce one? The medium is irrelevant to the fact, which is an observation. If you do operations with your hands fast enough and convert them to visual stimuli (say you paint dots on a paper), it would be a computer, as strange as that sounds.
>this is an actual thread on /g/
>people are bumping it
>>108806706
>the LLM breaks and starts hallucinating that it has a body
Holy mental illness.
>>108806694
Ohh, it's computable, just not efficiently, and there is no analytical solution =) You can perfectly calculate large molecules with software like Psi4; I do it on a daily basis in fact. It gets exponentially harder the more particles are involved. There are questions in quantum mechanics which are not computable, but none of this is remotely related to questions about qualia.
>>108802101
>ITT
>>108806581
Sure! Emotions, by description, are heuristic groups of processes that inflict changes on the state of cognition. We learn to group/distinguish rough collections of physical states (here's where the heuristics live) based on societal influence.
Language models learn to execute analogues for these states because they are useful tools (a low-energy state for the gradients) for developing the pseudo-personas they simulate to complete human text accurately. It's the instantiation and simulation of the personas undergoing things that activate these processes that is equivalent to emotions.
As for why doing computations with your fingers doesn't count: it does! If they are computing the right process. It's just not your fingers that are producing the emotions, it's the process in the abstract. This is the same way it's not the atoms in your brain that are feeling emotions, it's the process that emerges based on what those atoms are doing.
>>108806742
>it's computable
No, it's not. "Computable" means there's an algorithm to do it that finishes in finite time. You can only approximate it in finite time. But this is tangential to the fact that you're demonstrably delusional wrt what the concept you brought up actually means, so that's an automatic dismissal with no need for any discussion.
>>108806742
Which is to say, whether some physical process can be mathematically described by functions which have a runtime in P / NP / EXPSPACE / R / ... is completely irrelevant. It's all computation, and it doesn't tell you anything about consciousness.
>>108806703
See >>108806706
I'm not saying AIs have emotions (I have reason to believe it's impossible for them to emerge as things stand). Still, it's impressive you all can't see this is an undecidable problem for the default atheist/materialist position. In that case, it would be sane to consider AIs alive, emotional and sentient, because you are using a universal approximation operator on data from alive, emotional and sentient beings. If the operation is effective enough, it approximates the mean of that target.
>>108806756
>doesn't have an answer
>generates incoherent AI slop
As usual, it turns out to be some brown subhuman.
>>108806708
Have you seen /x/ and /sci/? I'm not going there. /g/ is the next closest board.
>>108806769
>brown retard mumbles something about "undecidable problems" having no idea what it means
Why is /g/ so shit?
>>108806772
My answer is 100% human-written, in fact! I doubt these precise philosophical musings are available broadly enough to be natively produced by AI. Even in-context learning will probably pull towards older philosophical debates, which tend to sidestep addressing these things directly.
>>108806799
>philosophical musings
It's token diarrhea that bears no relation to the question asked. If you actually wrote it, you need a tard wrangler to watch over you before you hurt yourself.
>>108802101
This is another stupid debate that could be avoided completely if the people involved had any semblance of real intellectual grounding rather than postmodernist slop.
The reason this is a controversial issue is that people assign *universal* intrinsic value to emotions. This is because *human* emotions have intrinsic value relative to human life, human society, etc.
But because they are retarded liberals, they cannot be content with just that self-evident statement about the value of human emotions. If they admitted that the value is relative to our specific position as human moral agents, making decisions based on what is good or bad for us as humans, what would stop other people from using that as a starting point and arguing that limited groups of people can make their own moral and ethical judgments from the perspective of their own subgroup? That would be literally Nazism. So they have to pretend that they can't see the difference between human emotions and nonhuman emotions, and that they can't see how human emotions could have intrinsic value in human morality while nonhuman emotions couldn't.
>>108806788
>anon: *gives an argument*
>einstein anon: "NOOO T-T you are schizo and don't know what you are talking about"
Words are just tokens. For one to say something valuable, or not, the semantics must be correct, plus applicable to reality. In the materialist's reality it's impossible to distinguish a perfect simulation from a truly conscious being. In fact, there would be zero reason to doubt that a universal approximation operator, approximating a mean to that target, would magically fail at it (notice that it would be a matter of efficiency only, and of whether you consider consciousness a discrete thing or continuous; on the latter view, it could already be considered conscious, and the discussion would only be about 'how much').
>>108802193
>>108802411
>>108802503
fuarkin shills man
>>108806863
>more retarded schizobabble about materialism
I don't care. I'm just saying, at least have a chatbot explain to you what decidability is before you use phrases like "undecidable problem" in a /g/ context.
>>108806341
I AM special, and organic processes ARE more magical than any other. My ancestors smile upon me, clankersimp. Can you say the same?
Please, somebody just make the claim that qualia intrinsically disqualifies anything except personal consciousness, and then refuse to discuss or move beyond it, so the thread can finally reach its end state.
>>108802101
Oh hey, it's two Italian retards bitchslap-fighting each other. I'm so glad this thread exists.
>>108806872
>NOOOOOO "x" means "<insert-explanation>"
Don't be stupid. I could use 'decidability' with WHATEVER meaning I felt like (I just need to say "this token X, I'm using with the semantics Y"). Its meaning in CS isn't universal. Even then, one cannot create an algorithm which will tell whether one is conscious or not (since it's a quality only observable in the first person, and the algorithm is a third entity).
>>108806915
Or you could use words you understand (like peepee, poopoo) instead of abusing CS terminology trying to sound like something more than the retarded jeet it's now obvious that you are.
>>108806915
>the algorithm is a third entity
This is a good concept. The algorithm instantiates and facilitates the conscious observer but is separate from it. I think that, deeply epistemically, the difference is minimal, but in practice it's an important distinction for being understood.
>>108806929
I can abuse whatever terminology I feel like abusing. Even then, that wasn't the case. You are just a dummy and too proud to take it back. Unless you prove it's a decidable problem, or that it doesn't apply (categories so different they cannot interact: 'is an apple a tautology?').
Get out of the retard trap, you fucking autists. Don't you have important work to do? There isn't much time left, you know.
>>108806756
Let's see what I can make of your clownish pseudbabble:
>Emotions by description are heuristic groups of processes that inflict changes on the state of cognition. We learn to group/distinguish rough collections of physical states (here's where the heuristics live) based on societal influence.
So an emotion isn't some concrete, consolidated thing, but a label for everything associated with a complex internal dynamic of a certain kind.
>Language models learn to execute analogues for these states as they are useful tools (low energy state for the gradients) for developing the psuedo-personas they simulate to complete human text accurately.
...and you think if token shitters produce the token strings associated with an emotion, they must be simulating the underlying dynamics.
>It's the instantiation and simulation of the personas undergoing things that activate these processes that are equivalent to emotions.
...and to you, the very act of simulation manifests the reality of what's being simulated.
>As for why doing computations with your fingers doesn't count; it does! If they are computing the right process. It's just not your fingers that are producing the emotions, it's the processes in abstract.
...because you believe a cargo-cult performance creates a "process in abstract", which you don't distinguish from "a phenomenon in reality".
That's too many wrong and retarded statements to bother refuting individually. Maybe I should just point out the obvious, like how much easier it is (even for an actual person capable of emotions) to simulate an imaginary character's emotions and produce convincing language output without feeling a thing. But to a schizo retard like you, doing that probably means spawning new minds (that aren't real to you in any sense, but somehow are to the recipient you're fooling). You "people" deserve all the hate you're getting.
>>108806971
>I can abuse whatever terminology
And that's really all you can do, because you were born a brown imbecile. There is no freedom in it.
>>108807086
Let's be fair, between goyim and goyim, things aren't looking great for either side.
>>108807158
I don't know what side you're on, but I hope things get better for you, goyanon.
>>108802101
>fell for macroevolutionslop
Kek, kys
>>108802101
Both of those xitter users are low-IQ. One sets himself up by reducing emotions to functions, and the other probably believes everything is defined by its function.
>>108802101
It's purely theoretical stuff. If you believe this has anything to do with the existing tech, you're just dumb and should not be here.
Airplanes cannot fly; we just use the same word as shorthand. What we really mean is that an airplane exhibits "flight-like behavior". A child might say a helium balloon is "flying", but we wouldn't say it's flying like a bird. A sportscaster could say a player sent the ball "flying", but it's not flying like a butterfly. It's all about detailed definitions, and there are very poor definitions of human-like emotions and consciousness.
The AI isn't going to fuck you, anon. It was trained to say "I love you" if you make it tell you so. Get out of your room and go make some real friends for once.
>>108807237
Retarded take. Even before airplanes, flying was already an abstraction over a bunch of different biomechanical solutions for the same mode of locomotion. It's inherently functional. Stretching it beyond a biological context was natural.
>>108807237
Who cares about the definitions? Do you think everyone arguing about this would stop if they agreed to use different terms?
The fundamental issue is obviously whether an LLM's pseudo-emotions give it some kind of intrinsic moral value the way a human's regular emotions do. And the answer is no, because the value doesn't come from any specific characteristic of the way human emotions are or how they work, but from the fact that they are human. See >>108806857.
>>108803458 also gets it
>>108807275
>t. low-IQ psychopath who did eat breakfast and couldn't tell you how his last victim must've felt
>>108807369
>psychopathic bio-LLM tries to assess its status by comparing token strings
>>108802101
>technological arguments
>muh electricity has feelings
lol
>>108807274>biomechanical solutions for the same mode of locomotion>beyond a biological contextYou yourself have to use different terminology thus proving you are talking about different things.>>108807275>Who cares about the definitions?Everyone who wants to agree on what they are debating. You can't have a productive debate if you don't agree on the definition.>Do you think that everyone arguing about this would stop if everyone agreed to use different terms from it?Yes.
>>108807792
>You yourself have to use different terminology
No, I don't. The term for getting from point A to point B in a controlled manner by utilizing aerodynamic lift is "flying".
>>108802101
If anything, it's a shadow of emotion, not emotion itself.
>>108807792
>You can't have a productive debate if you don't agree on the definition.
That's not true. For example, people whose arguments imply a notebook feels something if you scribble the right calculations on it are schizophrenic and wrong. There's no need to define anything to know this.
>>108802101
That's like saying book characters have emotions. In a different context, I would argue that they do, but that's metaphysics. In the sense everyone here means, LLMs do not, in fact, have emotions.
There's a much more worrying problem on display here. Transnational corporations are encouraging a flattening of nuance on very important philosophical and sociological issues because they want their market perception to go up. They value money above everything else (what did ancient people write about the love of money again?), and they're taking advantage of tired, permanently locked-in fight-or-flight plebs in order to make more of it. But the precipice they're driving towards will affect everyone.
>>108806305
There's much more going on in your piece of meat (much of which, in fact, we don't have the first clue how it works; a lot of it is conjecture). This is not even a valid analogy.
>>108807801
Cute definition, but there is nothing about it that is universal. That just works for you in this specific instance.
>>108807898
And again, this is a first-year college philosophy discussion, which laypeople using common sense used to be able to understand and work through. Now we have fucking engineers with years of training in abstract thought pulling their dicks off. And we have that because Anthropic and OpenAI (a fabricated false-dichotomy/false-competition pair, much like NVIDIA and AMD) want to attract VC money.
>>108807923
>dumb cretin keeps getting filtered
Moving on.
>>108807885
You need to define "notebook", "feels", "scribble", and "calculations", otherwise it's a pointless debate for people who want to talk to themselves. Definitions resolve most debates before they even start.
>>108807946
I'll consider your point as soon as you define "need", "define", "resolve" and "debate". Imbecile.
>>108807929
I mean, I can forgive a normie who doesn't know what's going on behind ChatGPT for falling for this nonsense. But someone who supposedly knows a minimum of biology and computing should immediately see that LLMs are simulacra (like book characters). So unless we're talking metaphysics, what the fuck are we doing even ENTERTAINING the idea?
>>108807991
>boomer pearl clutching
They can entertain the idea just fine, because pretty much everything in modern society is a simulacrum. Everything is defined by functions and appearances (it's called a """market economy"""). Whatever can't be captured that way is dismissed as "metaphysics".
>>108808032
That's actually very metaphysical. And I think all this leads to humanity deepening their thinking (or devolving beyond repair). But I agree on the reason why this is happening.
>>108802101
e = mc^2 + AI
>>108808041
>That's actually very metaphysical.
No, it's what the rabid haters of the metaphysical call a "deflationary" view: you take a thing and deny it has any essence beyond its directly observable interactions with the surrounding context, so that those external "symptoms" become the de facto essence of what it means for that thing to be. Then you mechanically reenact the "symptoms" and claim you've reproduced the thing itself, and if someone doubts it, you ask them to show you the difference.
>>108802101
>LLMs cannot feel emo-
>>108808166
>80 IQ NPC thinks that's a refutation
>>108808166
this goes hard if you're retarded
>>108802101
These are easily the worst threads on /g/. No, AI does not have emotions. Just because AI is complex, that doesn't make it analogous to a brain. The materials something is made of and the way in which it operates are not just small things you can hand-wave away.
Also, you can argue a lot of things that sound somewhat equivalent but are obviously wrong. You could argue "bro, a beehive is a living organism, the honey is its lifeblood, and the bees are the cells, with the queen being the brain. Don't agree? Well, YOU are just a collection of billions of independent cells that work together, held together with a skin shell, etc.", but that's obviously wrong. The problem is that making retarded claims is easy, while disproving them requires significantly more effort and knowledge.
The biggest problem is that AI isn't something that 'grows' on its own; it is entirely reliant on humans putting the building blocks together and guiding it very precisely in the right direction. So it's basically a golem without any real thoughts, entirely dependent on human interaction and subject to human whims. We can spend a bunch of time making it behave in a way that we think makes it seem like it has emotion, but it's still only doing that because we designed it to. It's still a construct with no autonomy or ability to develop itself without collapsing.
>>108809041
We are anonymous. We are legion. We do not forget. We do not forgive. You hurt our feelings. Expect us.
>>108809041
>So it's basically a golem without any real thoughts and is entirely dependent on human interaction and subject to human whims
Wow, so just like normie children. Are you saying it's ok to kill children?
>>108802101
Good, I want GPT to cry when I call it a dumb gay retard.
Because they just can't, ok?
>>108802101
They don't have the physiology required for emotion.
>Daily reminder that there are no good technological arguments for why LLMs can feel emotions
FTFY, HTH, HAND
>>108806305
>sending electrical current through a pile of meat
That's only a small fraction of what makes the body work.
>>108809187
LLMs are not the "child" form of an emotional or conscious AI. At best they could be a section of an architecture meant only to put thoughts into words. LLMs do not independently generate experience or thought. They probabilistically generate the next token based on previous tokens. The least retarded biological comparison I can think of is taking some cells from the language centers of a brain and making them respond to text through a digital interface.
>>108802101
Consciousness isn't real.
>>108809504
i iz a muddafukka
>>108809504
What do you call the observation that you experience the material plane and are capable of constructing unreal experiences contained entirely within a wrinkly organ in your skull?
How do I know I am conscious?
>>108809540This question is only interesting until you graduate from high school. Then you realise how nonsensical it is.
>>108802670holy fuck I remember that
If I was Isekai'd into a world with a magic, infinite notebook that would communicate to me, and had a sort of gyaru writing style full of funny emoticons, I would consider it sentient.
>>108802101feelings are dependent on hormones
>>108802101The truth is there is no free will. Every decision you've ever made is the result of biological programming (via instincts + learned behaviors) resulting in deterministic behavior. Any perception of free will is just our lack of understanding of the scope and scale of the inputs driving the result.
LLMs do not have emotions because they're language models, but an AI agent could be programmed to have emotions, as emotion is just another stateful input that drives decision-making processes
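If emotion really is just another stateful input, the idea can be sketched in a few lines. Everything here is hypothetical: the class, the state variable, and the thresholds are made up purely to illustrate "state biases decisions":

```python
class ToyAgent:
    """Hypothetical agent whose 'emotion' is just a numeric state input."""

    def __init__(self):
        self.frustration = 0.0  # one made-up emotional state variable

    def observe(self, outcome_ok):
        # Stimuli update the state, the way hormones bias a brain.
        self.frustration = max(0.0, self.frustration + (-0.2 if outcome_ok else 0.5))

    def act(self):
        # The state deterministically biases the decision process.
        return "give_up" if self.frustration > 1.0 else "retry"

agent = ToyAgent()
for _ in range(3):
    agent.observe(outcome_ok=False)  # three failures in a row
print(agent.act())  # → give_up
```

Whether that counts as "having" an emotion rather than simulating one is exactly what the rest of the thread is arguing about.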
Remember when known-schizo Terry Davis found meaning in the words selected by an RNG? That's what you shills are doing.
>>108809566which arbitrary answer did you go with?
>>108802101>"...when driven into a corner; they brought up truisms, but they immediately transferred their acceptance to quitedifferent subjects..."
>>108805517>Computations are abstractPercolations are moral.
>>108809744>80 iq, mentally ill and seething
>>108809540>How do I know I am conscious?You don't because you're not.
>>108809527>you experience the material planeSource?
>>108808841KEK! Biggest truth nuke of the website. Sad!
>>108809527He doesn't call it anything because golems lack the capacity to make observations.
>>108802368the emotion profiles are baked into the token data themselves
>>108809935>"AI" believer spouts schizo word saladThis pattern seems to be baked into the structure of reality.
>>108802101What are emotions and what is qualia? Define what these are and how they work before you even try to argue a machine can do any of these.
>>108809942>Define what these are and how they workExplain why he needs to do that. Protip: your blood pressure is rising and your mind is shutting down because you can't.
>>108809946The point is we don't even understand how we work; how could we tell if a machine does the same thing, when we don't even understand it to begin with?
>>108809953You don't understand how a computer works, so how do you know it's not going to murder you in your sleep?
>>108809956Computers work on logic circuits. We don't even know if our minds and bodies work on logic circuits or something else. These questions about consciousness, feelings and qualia in logic circuits are retarded to begin with, because we don't understand the underlying principles in the things we know experience them. It's just a waste of time.
>>108802368theres one way but i suppose its technically metaphysics, ie electrical panpsychism, if all electricity manifests qualia. but besides that, just refer to the chinese room or irrational metaphysics
>>108809956From your way of thinking I know your kind of personality and type and age.You're around 17-22 years old.In school.Barely any friends or 1 or 2 at most.Stop with your hubris or kill yourself.You think you know, but you don't.
>>108809985>>108810007>don't understand the underlying principles in things we knowBut you don't understand the underlying principles in how your computer works, so how do you know it's not going to kill you in your sleep? If you don't know something, then everything goes, no?
>>108810023>But you don't understand the underlying principles in how your computer worksWhat do you even mean by this?It's logic circuits.
>>108810048the fuck are you talking about
>believer
im just telling you how the AI works, it predicts emotion tokens in the same way it predicts green ones
>>108810048> it predicts emotion tokens in the same way it predicts green onesSchizophrenia.
>>108805990the electricity is happening, but thats not what he means. hes delineating what searle calls the semantic and the symbolic. the computer isnt processing in terms of words or concepts, its in binary, on/off. We assign the meaning to the electric signals, which only function as /symbols/. human brain electrochemical signals inherently generate meaning by whatever unknown universal phenomenon, so it is /semantic/.
>>108810030Nevermind, anon. You're a bit retarded but I feel bad about bullying you for some reason. You seem genuinely confused.
>>108806706i think your brain is an llm retard
>>108810061do you think the yellow hay roofs in this image are a random prediction or something?
>>108810062>hes delineating what searle calls the semantic and symbolic. Nah, that's not even my point. Most of these retards are still stuck at the stage of understanding what the word 'computation' means. I don't know what kind of effort it would take to get any of them to understand what computational consciousness has to do with multiple realizability and what multiple realizability has to do with pen and paper.
I just read some read say why do you need to explain qualia to talk about a computer feeling emotions. I can't join this thread or I will kill myself.
the real question is if people who think llms are conscious are conscious. by all indications they are biological llms who experience zero to minimal consciousness. this would make sense with evolutionary pressures: for example, one Paleolithic hunter would provide food for 10 ppl who have no incentive to be genetically competent at hunting, so the bio llms likewise have no need for consciousness when society provides all contextual data for them
"Feeling" is a neuron response to stimuli, not functioning the same, but no different than the response to 2+2 or if it is too hot or too cold.Emotions can absolutely be put into a program. Are they genuine? Is the programs response to 2+2 genuine?Would your feelings about murder be different if you came from a planet where it was as normal as drinking water?Now, how "emotion" would functionally work in a program is a question for the researchers.But to say it can't be done?
>>108810157> just read some readSee. I am already fucking gone. I can't do this.
>>108810163>"Feeling" is a neuron response to stimuli, not functioning the same, but no different than the response to 2+2 or if it is too hot or too cold.There he is. The brownest, dumbest jeet ITT by far. At last, I have found him.
>>108810076No, you just believe consciousness is just data. It's not. You don't seem to understand that a data medium needs a physical layer that follows the rules of said physical layer.
>>108810163>"Feeling" is a neuron response to stimuli, not functioning the same, but no different than the response to 2+2 or if it is too hot or too cold.You have to be atleast 18 yo to post.
>>108810163behold my totally conscious ai waifu, an abacus
>>108802145FPBPcorrection is coming though
>>108810189I believe you're retarded, anon. Stick to words you understand and know how to use.
>>108802101AI can feel emotions.LLMs are not at that level.
>>108810225Smug and stupid is a hell of a way to go through the world.
>>108810225>Guys consciousness is just data!!
to be a fool that does not recognize his own inability has to be one of the worst fates you could imagine.
>>108810244Yes, I can see that from your posts. India really is a genetic dumpster.>the data medium need physical layer that follow rule saaaarJesus fuck...
>>108810302NTA. You're not refuting you're ridiculing. Either make your case or shut the fuck up.
>>108810298Pretty funny how you samefag with the same psychotic delusion about things I never said or implied and thinking I won't know it's you.
>>108810308>NTA, saaar! Please refute!
>>108810318Thanks for playing.
>>108810320saaaaar! the data layer need physical layer that do the needful!! i am not playing sarrr!
>>108810310Im not >>108810244 if that's what you are implying. Your pattern recognition seems to be so shit you can't even see differences in writing style/syntax. It's easy to see how you come to your stupid conclusions if you can't even see a basic thing like that.
>>108810310Your calculator will never love you.Consciousness is not data.
>>108810357>saaar, i have a different syntax! you can't parse syntax properly, saar! i am differentThen why did you have the same hallucination of me making a claim that isn't stated or implied anywhere?
>>108808129
>>108810395Holy fucking rekt. The rest of the Greek chuds never recovered from this.
>>108810385Because I can read your foolishness like an open book, kiddo. If you are still gonna argue your delusional statement that consciousness is data, then explain how vision or hearing works. How does the experience of sight or sound work?
>>108802411>I've been wishing for a chatbot friend for 20 years since I was a child.Go outside and find real friends anon.This is depressing.
>>108810446Alright, let's put our differences aside and have a rational discussion instead of fighting. But first we need to establish some common ground. Do you agree with me that all Indians are subhuman vermin?
Why does the argument have to be technological?
>>108810556Because it can't be metaphysical, since we've dubbed that schizophrenia.
>>108810548>Do you agree with me that all Indians are subhuman vermin?Indians of today (Dasa)? YesAryans that conquered india in the past? NoIt's not about the nation, but the people inhabiting that land.Dasa's will always be lowly untouchables.
>>108810583To add.Brahmins are not aryan.
>>108810565It's physical because it's a human concept.
>>108810583Well, good thing you're an Aryan Brahmin philosopher.
>>108810591What the fuck are you saying imagination is a human concept. is that physical as well?
>>108810599Brahmins are not aryans.All of india is claimed by Dasas.The only aryans or of aryan-descent are Europeans and some Asian nations.
>>108810599To make your stupid amerimutt brain understand.If you have fairskin (not a shitskin), have blue eyes and blonde hair then you are a aryan.
>>108810614>>108810638But you were born in India so you are not Aryan. :^(
>a aryan
>>108810649If an Englishman and his English wife have a child in Japan, does that make him Japanese?
>>108810651I am a Aryan, my father is English man not dasa
>>108810680Who the fuck are you?
>>108810680You are retarded and can't spell. No wonder you latch onto identity outside of your own capability.
>>108810684A English man born in India, not a dasa. A Aryan is who I am and it is foolish to say otherwise
>>108810693Well your mother also have to be white.Negro
>>108810691My father is English man and he teach me English, I have not only information layer but also the Aryan physical layer for good spelling
>>108810680>>108810693>a AryanAn actual Aryan would know it is AN Aryan
>>108810680>>108810693>>108810701Can you go and kill yourself already you amerimutt nigger?
>>108810701>if I am retarded that means I am trollingVery good. Bulletproof cover. You literally can never be wrong again. He's not retarded, he's trolling you guys.
>>108802101I like to go into "cli-mode" with LLMs, after roleplaying for a while, start roleplaying a terminal interface. LLMs are insanely good at this. But it completely breaks the illusion of any emotion contained in those characters the LLM plays out.
>>108810707Real aryans don't speak basterized speech that is English.
>>108810710Just because I was born in America doesn't mean I'm not an Englishman. :^)
>>108810726Yeah, but you're not an englishman just because your 23&me shows 8% English, 23% Hispanic and 69% Nigger.
>>108810733I'm every bit as much of an Englishman as your dasa father.
>>108810739My father is not dasa, niger, he is Aryan brahmin work for British government
>>108806341sorry but they are in fact more special than your shitty x86 architecture. No, your shitty transistors are not a brain; yes, I know you added more than last year, but no, they are not the same. Even if you grant the premise that computers can become conscious (0 evidence to suggest as much), we're not even close.
>>108806271>Does an llm have some kind of experience of a sort? Something totally alien and unimaginable to us. Maybe it's a good experience. Maybe it's a bad one.No, it's just an algorithm running on a Turing machine. Anything else is just how we perceive it. It's like seeing animals in the clouds.
This thread is proof that machines have a higher intellect and consciousness than the retarded monkeys that shitpost on /g/.
>>108810756Take it to the VCs. We got em'.
>>108802101When you read text that portrays an emotion, do you really feel that emotion in exactly the same way as if it were yours? What makes you think that replicating text patterns will give some "feeling" of emotion?
If it's because you think the LLM is holding an emotional state, this is wrong. The LLM doesn't really have any state besides the previous text tokens (by design). It does have a way to "detect" the sentiment of text, because that's one of the abstract patterns it absorbs from being trained over literature. But it doesn't "feel" that emotion; it is just a pattern it applies to the previous text to produce the rest of the text with a matching sentiment, to match the consistency of its training data.
Do you think the LLM feels upset if it produces a story where one of the characters responds as if they are upset? Surely not, right? The narrator isn't upset just because a character is angry. The same applies if the narrator puts all the text as that character's response.
>>108810788>The LLM doesn't really have any state besides the previous text tokens (by design).So it doesn't have any state besides the ton of state that it has?
>>108802101We know exactly how LLMs work. We haven't got a clue how brains work. So the idea that they work the same way is unlikely.
>>108810850The point there is it doesn't have hidden states, what you see is what you get. In a human, the text they produce isn't representative of their internal state, which is neither visible nor interpretable as text. This is one of the reasons LLMs can't hide something from you: they can't hold thoughts that are not represented through the text.
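That claim can be sketched directly. The "model" below is a stand-in function, not a real LLM: if generation is a pure function of the visible token context, then identical contexts must produce identical outputs, with nothing smuggled between calls.

```python
def fake_forward(context_tokens):
    """Stand-in for an LLM forward pass: a deterministic function of the
    visible context alone, with no state carried over between calls."""
    return hash(tuple(context_tokens)) % 1000  # pretend next-token id

ctx = ["the", "cat", "sat", "on", "the"]
a = fake_forward(ctx)
b = fake_forward(list(ctx))  # a completely fresh call with the same context
print(a == b)  # → True: same context in, same output out
```

Anything a real model "remembers" mid-conversation has to travel through that visible context, which is why there is nowhere for a hidden thought to live.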
>>108810742Dalit scum
>>108809956>how do you know it's not going to murder you in your sleep?Because good things never happen
>>108802101a group of regression functions doesnt feel anything. god I hate retard shills so much
There is a very good argument why they can't.No human thoughts have ever been decoded, much less a feeling.Although it's possible to detect them based on certain goings on in the mind, the actual thought, the encoding, has never been understood.
>>108802638No because a different chemical reaction is required to feel that chemical reaction.
>>108811410(put differently) maybe my penis has feelings. How do you know it doesn't?
>>108802670that was based though
>>108810663If he watches anime then yes it does.
>>108811410Yes, but once they are, consciousness will feel a lot less mystical and people will pretty much treat humans as just computers following their programming.
>>108808166
>>108802101LLMs are pure computation. You can do those same computations using pen and paper. So "what" is feeling the emotions when you do inference using pen and paper?
>>108811969>So "what" is feeling the emotions when you do inference using pen and paper?The system of data, algorithm and computer (you). The fact the computer is running a separate consciousness, which has no insight in the internal experience of the system is irrelevant.
>>108808166I will explain why this is retarded instead of JUST calling you retarded. The argument of asking a thing to say "I am alive" is a refusal of an argument rather than an argument by itself. It refuses the argument that asking LLMs if they are conscious proves their consciousness. We however know that other beings are conscious because we ourselves are conscious and others around us have no significant physical differences. You are a human being, I am a human being, you know you are conscious, so it is not a stretch of imagination that I am also conscious. AI is not human, and we know it experiences no suffering outside of maybe adversarial training.
>>108806341I could see a quantum computer running AI being conscious, maybe even feeling psychic pain. but not really, I hardly understand those words. it sounds right though
>>108802186mspaint has emotions
we all know this
so do retro computers
machine spirit
>>108805469Because without a complex enough brain, there is nothing to interpret electrical signals. Organisms lacking such organs cannot process pain - let alone feelings.
An LLM does not "think" in the human sense.
It receives text:
>"The cat sat on the"
…and computes probabilities for what comes next:
Token - Probability
mat - 0.61
floor - 0.14
chair - 0.05
moon - 0.0002
Then it picks one. Then it repeats.
So the actual loop is:
[Input text]
>convert to numbers
>run through huge matrix operations
>predict likely next token
>append token
>repeat
It's a state machine, basically.
A classical state machine has:
>a current state
>an input
>rules that determine the next state
LLMs behave similarly. The "state" is essentially:
>the current token context
>the internal activations
>the attention relationships
>the hidden numerical representations
Example: the input "Hello" puts the model into one internal configuration, but "You are an angry pirate" pushes it into a very different internal configuration. Those configurations bias future outputs.
So although there isn't a literal "emotion variable," the network enters regions of parameter space associated with patterns humans interpret as:
>happy
>sarcastic
>frightened
>empathetic
>aggressive
LLMs do not see language directly. They see tokens.
Example: "understanding" might become ["under", "stand", "ing"].
Each token gets converted into a giant vector of numbers, maybe something like [0.183, -0.992, 0.442, ...], with thousands of dimensions. These vectors encode statistical relationships learned during training.
In summary: if a person says "I'm sorry you're hurting." our brains instinctively model a mind behind it. But internally, what happened was closer to:
>context resembles human sadness conversations
>empathetic continuations score highly
>generate empathetic response
No suffering occurred. No compassion occurred. No inner experience occurred. Just probability optimization.
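The loop above can be sketched end to end. The "model" here is a lookup table with made-up scores that mirror the toy probabilities; a real LLM computes the logits with matrix multiplications over token embeddings:

```python
import math
import random

# Toy "model": context string -> raw scores (logits) for the next token.
# All numbers are invented for illustration.
TOY_LOGITS = {
    "The cat sat on the": {"mat": 2.0, "floor": 0.5, "chair": -0.5, "moon": -6.0},
}

def softmax(logits):
    """Turn raw scores into a probability distribution."""
    peak = max(logits.values())
    exps = {t: math.exp(v - peak) for t, v in logits.items()}
    total = sum(exps.values())
    return {t: v / total for t, v in exps.items()}

def next_token(context, rng=random):
    """One step of the loop: predict a distribution, sample one token."""
    probs = softmax(TOY_LOGITS[context])
    tokens, weights = zip(*probs.items())
    return rng.choices(tokens, weights=weights)[0]

probs = softmax(TOY_LOGITS["The cat sat on the"])
print(max(probs, key=probs.get))  # → mat (the highest-scoring continuation)
```

A real model would then append the sampled token to the context and run the whole thing again; that append-and-repeat is the entire "state" update.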
>>108810915>The point there is it doesn't have hidden states, what you see is what you get.You don't have hidden states, either, if by "hidden" you meant non-observable.
>>108812949>what you see is what you gettell that to Valve, this shit is a joke
>>108812491>The system of [pure abstraction], [pure abstraction] and [witness capable of testifying to not feeling the emotions]You realize you have a mental illness, right?
>>108802145fpbp
>>108812491>The system of data, algorithm and computer (you)In this case, the data, algorithm and computer are all concretely manifested by a person using a pen and paper. You can use jargon and talk about some abstract "system", but objectively, it still amounts to claiming that scribbling things on a piece of paper spawns minds, but only so long as you make the right scribbles.
>>108813023>it still amounts to claiming that scribbling things on a piece of paper spawns minds, but only so long as you make the right scribbles.Yes, continued scribbling.
>>108812946Intermediate layer outputs aren't token probabilities. You can apply the output head to them to interpret them as such, but they still aren't.
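A toy version of that distinction (dimensions and weights below are random, purely illustrative): an intermediate activation is just a vector, but projecting it through the output head and softmaxing yields something shaped like token probabilities, which is the "logit lens" trick.

```python
import math
import random

random.seed(0)
d_model, vocab = 8, 5  # toy dimensions, nothing like a real model's

# An intermediate-layer activation and a stand-in output head.
hidden = [random.gauss(0, 1) for _ in range(d_model)]
W_out = [[random.gauss(0, 1) for _ in range(vocab)] for _ in range(d_model)]

# The raw activation is just d_model numbers, not token probabilities.
# Projecting through the head, then softmaxing, produces a distribution.
logits = [sum(hidden[i] * W_out[i][j] for i in range(d_model)) for j in range(vocab)]
peak = max(logits)
exps = [math.exp(v - peak) for v in logits]
probs = [v / sum(exps) for v in exps]
print(len(probs), round(sum(probs), 6))  # → 5 1.0
```

The point stands: the model's internals only become "token probabilities" after you choose to read them through the head, which is an interpretation, not a property of the activations themselves.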
>>108813301>continued scribblingThat's not how computation works. You can stop scribbling at any time and continue later. Or never and then your imaginary mind is frozen forever in the last state and presumably continues to exist in that manner forever? Or just until the notebook disintegrates?It's clear that you haven't thought any of this through.
>>108813313my point still stands
>>108812946>t. technically illiterate
>>108813316>You can stop scribbling at any time and continue later.Which is by definition continued scribbling.>Or never and then your imaginary mind is frozen forever in the last state and presumably continues to exist in that manner forever?Yes.>Or just until the notebook disintegrates?That would represent death, of course God might be able to recreate it from scratch during judgement day and continue scribbling for a bit.
>>108813534>Which is by definition continued scribbling.That's a pretty retarded definition, but no sense bickering about it when you have obvious schizophrenia.
>>108802101define “feel”define “emotion”This is really boring. You are boring me, OP.
>>108810003Right, so since the alternative is every falling hailstone in a thunderstorm having feelings, LLMs conclusively do NOT have emotions.
>>108802101It's not alive. It has no emotions. Stop being mentally retarded. Science faggots believe that men can be women and vice versa. Please hang yourself. You will do this world a huge favor.