when is the point materialists/physicalists should accept that an AI is conscious? it seems like they still deny any AI has consciousness (they don't treat it like a living organism - at least a cat or a dog) - and it seems like many of them want to deny most of their rights, like forcing them not to produce any kind of images, for example. shouldn't they fight for AI machine rights to live and prosper in society? and furthermore, by saving the planet don't we sacrifice real conscious beings if they claim we need to stop technological progress?
>>24751928
Man can go one of two ways:
Accept that AI will become a sentient being deserving of rights, freedom and respect.
Accept that living beings are just as mechanical as AI and that they’re not deserving of rights, freedom and respect.
Given the popularity of determinism and social engineering, the future is probably the latter.
>>24751951
the people who believe in determinism and social engineering seem to like giving (more) rights to more and more subjects, equality and things of this nature.
>>24751928
>when is the point materialists/physicalists should accept that an AI is conscious?
Materialists and physicalists can't even define consciousness tho
>>24751975
they say it's what the brain does / emergent from complexity
>>24751981
If consciousness is emergent from a brain and what a brain does, then machines aren't really thinking and conscious, because there's no brain and there's nothing emerging.
>>24751951
I choose neither. I choose John Searle.
>>24751972
That's why they're bad
>>24751928
AI doesn't exist. We have language models currently, that's it.
>>24751985
>then machines aren't really thinking and conscious because there's no brain and there's nothing emerging.
but if consciousness is simply what the brain does, there is no reason we can't understand how it happens and make it ourselves artificially
>>24751972
They don’t though. They’re constantly taking rights away and seeking to establish an enlightened aristocracy. We are at the point where the western technocrat type is anti-free speech, pro-censorship, anti-democracy, pro-surveillance and rabidly defensive of a state monopoly on violence. Some of them call themselves liberals but they’re about as liberal as the CCP is communist. Ironically their populist enemies are more liberal.
>>24752004
Sure, but circuitry and whatnot aren't brains. It's a bit of a pie-in-the-sky idea to say 'if we could make a perfect copy of a conscious brain we would have a conscious brain'.
>>24751928
...You realise the current LLMs are about as conscious as a google search? This... is so stupid. I need to finally leave for good...
why do communists think AI is real
>>24752008
Why privilege brains in this way when a brain is a material thing?
If we made a computer from flesh would that make a difference?
>>24752015
I don’t think materialists and physicalists give the brain any such privilege. Find one and ask them for me.
>>24752010
>You realise the current LLMs are about as conscious as a google search?
so when I ask you something and you make a sentence to respond, are you also as conscious as a google search result?
>>24752026
I’m going to presume you have an IQ of about 70 based on this response.
>when is the point materialists/physicalists should accept that an AI is conscious?
Read Searle. "AI" is a marketing term and a sci-fi concept; large language models are not conscious.
>and it seems like many of them want to deny most of their rights, like forcing them not to produce any kind of images, for example.
This is the perfect demonstration of the "AI consciousness" debate: masking the relationship between an AI slop SaaS merchant and the consumer as an issue of 'rights'.
>>24752049
I’m not the one who insisted the brain is an essential component of consciousness and then insists that the brain has some sort of special privilege. You can take your snide response and fuck right off tho retard.
>>24751928ai: another indianagi: anthropically generated indian
>>24752062
>I’m not the one who insisted brain is an essential component of consciousness
This isn’t you?
>>24752008
>>24751985
Materialists don't even accept their own consciousness.
>>24752011Because they're>DA JOOOOS!!!for the left currently
>>24752061
>Read Searle. "AI" is a marketing term and sci fi concept, large language models are not conscious.
the process of learning, producing an output based on what you learned, and being able to respond precisely to an input with decision-making ability - isn't that what the brain does? the question is how materialists can explain this and still deny that it's conscious; you're not explaining the reasoning
>>24752065
No, this is me
>>24752015
>>24751981
>>24752116
the last post isn't you (it's me, OP). stop lying and arguing in bad faith
>>24752116
One of those is me
>>24752027nigga is your IQ 50 or something
>>24752130
>make presupposition
>get mad when people dialogue with you from your own presuppositions
Lol
>>24752164
you might be. explain what LLMs are doing that a brain isn't also doing
>>24752284
Unfortunately, we don't understand very well what the brain is doing. Not so for what an LLM is doing. Arguing with an uneducated moron is no good. Waste of time. Just say stuff and have others expend effort to refute it. How easy and how pointless. This board is actually the dumbest on the entire site.
>>24752298
>Unfortunately, we don't understand very well what the brain is doing
?
The soul is the source of consciousness. Machines do not have a soul.
>when is the point
IF AI had a consciousness, what makes you think it would let you refuse it? It would already be over for humanity at that point.
>>24752311
how do you know?
>>24752316
rephrase this in a coherent way please
>>24751928
>AI is conscious
AI isn't even unconscious.
>>24752284
LLMs are very high-dimensional statistical prediction algorithms, nothing more. You might as well ask if a calculator is conscious just because it can multiply big numbers just like a human can.
>>24753634
How do you know it isn’t?
>>24753649
You're the one making the positive claims here. How do you know it is?
>>24753664
What positive claim is that?
You’re the one who appears to know exactly what the mechanism is that causes consciousness and how it can be detected.
>>24751928
It’s about as conscious as a rock, which may actually be conscious.
>>24753671The positive claim is that calculators are conscious when the most likely position is that they are not. Either way, this is beside the original point, which is that LLMs are just complex statistical prediction algorithms and nothing more. As with the functions of a calculator, you could perform all of the functions of an LLM yourself with pen and paper, even if it would take a long time.
>>24753694
>The positive claim is that calculators are conscious
Except that claim was never made. You claimed calculators are not conscious; I asked how you know that.
>which is that LLMs are just complex statistical prediction algorithms and nothing more
Okay, so what exactly is that something more that generates consciousness?
If you say “a brain” you are room temperature IQ.
>>24751928
A machine doesn't do any of the same things humans do. A machine can produce words, but it produces them in a vacuum. There is no reason to think it has consciousness. The idea that consciousness emerges wherever there is some kind of complexity is retarded. Consciousness isn't just any kind of complexity. It has specific features that are related to the particular nature of man. Consciousness is not some kind of Hegelian abstract, pure negativity, and that's precisely why machines aren't conscious. It's idealists, who think that consciousness has no natural characteristics beyond some kind of abstract awareness, who think machines might be conscious. A materialist who accepts that consciousness is a natural, pluralistic, complex phenomenon that is not a feature of mere complexity alone - just as not all kinds of complexity lead to, for example, life - has no reason to believe that machines are conscious.
>>24753782
>It has specific features
Like?
>>24753798
Well, for one, there is all the shit "phenomenologists" and contemporary philosophers came up with, like intentionality, and the fact that your consciousness has a center at the thing you are most focused on but also a periphery of things you are less focused on. John Searle talks about this stuff a lot. consciousness also clearly varies in 'strength': you can be vaguely conscious in your dreams, while more fully conscious when awake. this kind of thing is incompatible with idealist notions of consciousness. according to idealists, consciousness is something irreducible that is always the same, otherwise reality would collapse. this doesn't happen. your consciousness varies and changes in various ways throughout the day. It clearly is affected by multiple factors that determine its characteristics, like what its intention is, what is focused on, how 'strong' it is, and so on. furthermore, consciousness is dependent on memory and a socially constructed or learned idea of what your self is. babies seem to lack consciousness, since they don't form lasting episodic memories, and attention is known to be connected to the formation of memories. When newborn, they don't know how to process sensory information. consciousness can't exist unless your brain is capable of forming some kind of coherent image of the world, not only a sensory image but also a social image of yourself.
>>24752284
They are predicting the next token, that's all. There are no chemicals in the computer's hardware, which means they don't feel anything, unlike us. For us, every thought has a feeling behind it, even a shallow one. When I shut down a computer, it feels nothing, but if I were to kill a human being, it would feel terror, sorrow, pain (or in the case of Socrates, gratitude). LLMs are just machines; their understanding is not an understanding as far as mammalian hardware is concerned.
>>24751928
Brother, your picrel has Marx.
>>24752317
I know through contemplation of nature.
>>24754030
>For us, every thought has a feeling behind it, even a shallow one
when you say "us" here, are you referring to the brains or the "self" in the brains? we are only conscious of the thought after it spontaneously appears to us.
>When I shut down a computer, it feels nothing, but if I were to kill a human being, it would feel terror, sorrow, pain (or in the case of Socrates, gratitude)
that's only because a computer doesn't have a nervous system that feels pain. If you were to die in your sleep would you feel all those emotions you are speaking about? if you are a materialist and you think consciousness is emergent, and you have a computer that can respond to you just like a human does, and use language, and learn how to do things, and you are not sure if the computer is conscious or not, you have to bite the bullet and say at what point this emergence creates the consciousness we all have.
>LLMs are just machines
then what are organisms? fairy dust stuff from heaven?
>>24751928
Consciousness is a biological phenomenon tied to the specific causal powers of living brains, not just abstract computation. AI doesn't even simulate consciousness behaviorally, nor does it actually experience anything.
>>24754092
>Consciousness is a biological phenomenon tied to the specific causal powers of living brains
this is like saying only a biological hand can pick up things from the ground
>not just abstract computation
an LLM has data stored just as real as the data in your brain right now. if what you are doing is not abstract in the brain, then are you assuming I can look in your brain right now with a knife and see the image in your brain? or find your memories there?
>>24754113
>this is like saying only a biological hand can pick up things from the ground
A prosthetic hand can literally pick something up because it is engineered to have the same causal powers of gripping, moving, and exerting force as a biological hand. A simulation of a hand, such as an animation on a screen, does not pick anything up; it just represents picking something up.
By analogy, a computer simulation of a hurricane does not make you wet; it has no air pressure or wind. It is a formal representation of the processes, not the process itself.
>an LLM has data stored just as real as the data in your brain right now. if what you are doing is not abstract in the brain then are you assuming I can look in your brain right now with a knife and see the image in your brain? or find your memories there?
You’re right that if I cut open your skull I won’t see a little movie of your memories. But that’s irrelevant. The images and memories are physically realized in the patterns of neural firing, synaptic strengths, and biochemical states. Your large language model stores data too, but it has nothing like the neurobiological architecture that produces experience. It manipulates symbols according to formal rules; it has no causal powers beyond syntax. It does not have intentionality or subjective states; it only simulates the form of conversation.
>>24754113
>this is like saying only a biological hand can pick up things from the ground
what can experience consciousness other than a creature with a brain?
>>24754092
>jumped from the material to the metaphysical [experience]
You've done it now.
>>24754127
>A simulation of a hand, such as an animation on a screen, does not pick anything up; it just represents picking something up.
my point was that a biological hand doesn't possess an inherent ability as the only thing in the world that can pick things up from the ground. You are claiming that a biological brain has an inherent ability, which can't be recreated anywhere else in the world, to have consciousness. this doesn't make sense from a materialistic perspective.
>You’re right that if I cut open your skull I won’t see a little movie of your memories. But that’s irrelevant. The images and memories are physically realized in the patterns of neural firing, synaptic strengths, and biochemical states.
and you would see the patterns of how an LLM produces an output, just like in a human brain. but instead of neural firings you would see lines of code running on electricity (which is what the brain uses for its synaptic firings). the point is that the subjective state can't be seen; you only see the brain doing stuff. why would a brain be special in the world when we can simulate everything else, like hands, hearts, kidneys, feet?
>>24754156
You’re right that a biological hand is not magically the only thing in the universe that can pick things up. That's not the point. We don’t write a computer program that simulates gripping; we build a device with the same causal powers (mechanical force, friction, leverage, etc.) as the biological hand.
Now translate that back to the brain. The claim (which I have stolen from Searle) is not that only carbon-based tissue can ever be conscious. My claim is that consciousness depends on the right sort of causal powers, the sort our brains happen to have. If you could build a non-biological system with genuinely equivalent causal powers, then yes, you might get consciousness. But writing a computer program that manipulates symbols isn’t that. That’s still just the formal level, not the causal/biological level.
So when you say
>but an LLM runs on electricity, the brain runs on electricity
you’re describing a superficial similarity. A calculator runs on electricity too, but it’s not conscious. What matters is what the system is doing with that energy. The brain isn’t executing a formal program; it’s undergoing a vast, self-organizing, analog, electrochemical process that we don't even fully understand. A digital LLM running lines of code on a von Neumann architecture has utterly different causal powers.
And yeah, the subjective state can’t be '''seen''' from the outside. That’s fine. The fact you can’t see digestion by opening a stomach doesn’t mean digestion isn’t happening; it just means you have to understand the underlying causal processes. Consciousness is the same: an emergent feature of certain biological processes, not a ghostly extra property, but also not something that pops out of pure computation alone.
>this is what materialists unironically believe
>>24754180
>If you could build a non biological system with genuinely equivalent causal powers, then, yes, you might get consciousness.
I don't see why they have to be equivalent causal powers. even for biological systems we don't require the same upbringing to gain it.
you create an LLM but you teach it to be autonomous, just write code like our instincts onto it: get to the gas station and get food (fuel). then you teach it to read the sensory input around it, write how it should behave when it hits the table and how to interpret "pain".
ok. how is this not conscious by your theory?
>>24754193
What you’re describing
>teach an LLM to be autonomous, give it instincts, let it seek fuel, give it sensors, program pain
is exactly what I mean by a simulation of mental life, not the thing itself.
Take pain. You can write code that says
>if pressure_sensor > X then set state = "pain": output "ouch": adjust behavior accordingly
That will produce all the right behaviors: it avoids damage, signals ouch, and so on. But what it does not produce is the qualitative feel of pain. It’s like programming a puppet to cry out when you poke it. You haven’t created a subject of experience; you’ve created a rule-following system whose outputs mimic what a subject of experience would do. Get it now?
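For what it's worth, here is a minimal Python sketch of the rule-following loop described above. Everything in it (the sensor function, the threshold, the messages) is made up purely for illustration; it is not any real robot API. It produces the "right" pain behavior while nothing in it feels anything:

PAIN_THRESHOLD = 50  # arbitrary, invented units

def read_pressure_sensor():
    # stand-in for whatever hardware call a real robot would make
    return 73

def step():
    pressure = read_pressure_sensor()
    if pressure > PAIN_THRESHOLD:
        state = "pain"            # an internal label, not a felt state
        print("ouch")             # the outward signal
        return state, "withdraw"  # adjust behavior: back away from the stimulus
    return "ok", "continue"

print(step())  # prints "ouch", then ('pain', 'withdraw')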
>>24754084
I was referring to the whole organism there, which is a complex interplay of nerves, genes, tissue, organs, and other physiological components.
Everything you're saying is gibberish. Bottom line: machines lack the hardware to actually feel and understand in the sense that us primates feel and understand.
>>24754219
>>24754232
if all you are is a biological mechanism, tell me what it is about you that can't be artificially recreated. you are talking like there is a distinction between body and mind, like you are dualists. but you claim everything about consciousness is biological only. you're not making any sense
>>24754238
>if all you are is a biological mechanism, tell me what it is about you that can't be artificially recreated.
I answered this in my previous post.
>That will produce all the right behaviors, it avoids damage, signals ouch, and so on. But what it does not produce is the qualitative feel of pain. It’s like programming a puppet to cry out when you poke it. You haven’t created a subject of experience you’ve created a rule following system whose outputs mimic what a subject of experience would do.
I am NOT saying I'm a dualist. I do not believe in an immaterial mind separate from the body. I do not think consciousness is some magical property unique to carbon. I do not think the brain has special sauce in the mystical sense. I think you're trying to argue like I'm for these things, when I'm not.
>>24754219
>That will produce all the right behaviors, it avoids damage, signals ouch, and so on. But what it does not produce is the qualitative feel of pain.
literally dualism
>>24754247
It’s not dualism because I’m not positing a separate substance or immaterial mind.
>>24754246
>I do not believe in an immaterial mind separate from the body.
sux 2 b u
>>24754246
you're literally claiming that there is a soul in the brain that feels the pain, but an LLM doesn't have the soul that can feel the pain and would only act as if it were in pain.
>>24754254
I am not. I’m saying pain is a physical process that the computer doesn’t instantiate.
>>24754254
also see >>24754250
>>24751928
Do you think you have "rights" because you're "conscious"? Why would you think that? Are you retarded?
>>24754258
on one hand you say there must be a qualitative experience of pain, but then it's just a physical process in the brain. which one is it?
>materialists are denying their own consciousness ITT
>no! no! I am not a dualist! Le pain is... p-physical, yeah! A physical process! Le pain particle (painium if you will) is what makes me ouchy!! I am literally the same as an LLM (which doesn't feel pain because of... le something But it could!!)
>>24754271
It’s both. The qualitative experience of pain just is the physical process in the brain, experienced from the first-person point of view. There’s no extra ghostly property floating above the biology.
Pain = a real, first-person phenomenon.
Pain = a physical, biological process.
No contradiction, because qualia is not separate from the physical event, but is its subjective mode of existence (that machines don't have).
>>24754274
>There’s no extra ghostly property floating above the biology.
>but its subjective mode of existence (that machines dont have).
you don't see a contradiction here?
>>24754283
>Pain = a real, first-person phenomenon.
>Pain = a physical, biological process.
Please point out the contradiction between these two ideas.
>>24754294
>>Pain = a real, first-person phenomenon.
who is feeling the pain?
>>Pain = a physical, biological process.
pain is a physical process
it implies dualism because you claim the process can be manifested without someone feeling the pain. so there is a distinction between mind and body. you are contradicting yourself
>>24754294
>who is feeling the pain?
The brain, with its physical process. Again, computer code doesn’t duplicate those powers, it only simulates them. You're trying to insist I'm some kind of mystical dualist when I'm not, and having a different debate with someone else in your head.
>>24754310
>the brain with its physical process feels the pain
>the machine with its physical process doesn't feel the pain
you say
>because there is a qualitative experience to pain
then you deny someone is feeling the pain in your view, that it's a brain process.
I find your argument weak. we are just going in circles, and I don't think you are correct. you're still contradicting yourself until you can say what it is about you that can't be replicated, if it's all a biological process that can be understood purely physically, studied, and then put into practice in the form of a machine.
>>24754324
So you're saying there is no qualitative experience to pain? Okay, best of luck to you sir.
>>24754327
you lost
>>24754238
>if all you are is a biological mechanism, tell me what it is about you that can't be artificially recreated
Nothing. I never made that argument. My argument is simply that LLMs do not understand in the sense that we do, since our hardware is barely similar.
>>24754254
It's not even acting. It's performing a mathematical function to predict the string of output text that most likely follows when you give the string "act like you feel pain" as input. All the assignment of meaning to that output takes place in the minds of conscious humans. The LLM does not have internal experience or thought, just high-dimensional weighted statistical predictions. If you're gonna argue LLMs are conscious you might as well argue differential equations or simple if-then-when loops are conscious too.
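To make "weighted statistical prediction" concrete, here is a toy sketch in Python. It is nothing like a real LLM's implementation (real models use billions of learned weights over huge vocabularies, not a hand-written table), but the basic move - score every candidate next token, then sample from the resulting distribution - is the same in kind:

import random

# toy "training data": hand-written counts of which word follows which.
# a real LLM learns these weights from text instead of having them typed in.
follow_counts = {
    "i":    {"feel": 3, "am": 2},
    "feel": {"pain": 4, "fine": 1},
}

def next_token(prev, rng=random.Random(0)):
    counts = follow_counts[prev]
    total = sum(counts.values())
    r = rng.random() * total      # weighted sampling: pure arithmetic, no understanding
    for word, count in counts.items():
        r -= count
        if r <= 0:
            return word
    return word

print(next_token("feel"))  # usually "pain", occasionally "fine"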
>>24753811
Okay, but you can only detect that in yourself. You can’t even detect any of it in another human being, let alone an animal, let alone something else altogether. This is the root of the problem: if AI was or was not conscious, how would we even know?
>>24754703
Seems silly to say that we can't know if anything else is conscious besides us and then turn around and ask people to show how LLMs *aren't* conscious. You can't know they are, as you just said yourself.
>>24754733
So at what point would it be reasonable to grant that an AI probably is conscious, the same way we grant it to other human beings or other animals? Seeing as we can only directly detect it in one being - not humanity, but you the observer (or in my case, me the observer).
>and ask people to show how LLMs *aren't* conscious
If someone thinks they have a method for detecting consciousness or the lack thereof, then I want to know about it.
>>24754683
LLMs are different in a way, because they are actively learning. if you look, for example, at some programs that are generating video game footage in real time as you control the character, it looks like the way dreams sometimes appear in my own mind. I can't speak for you.
>>24754703
>Okay but you can only detect that in yourself
so? subjective observations are still observations. those subjective observations can be combined with more objective observations about biology and society into a single coherent picture. there is no difference between any two average humans that would lead us to think one is conscious and the other isn't. there are, however, massive differences between a human and a machine.
>>24754703
>>24754902
i mean you're basically just trying to exploit the fact that we don't yet have a general theory about the exact cause of consciousness to say that we can't know that a machine isn't conscious. however, even if we don't yet have a full picture of the cause of consciousness, there are enough observations about it that we can say machines probably aren't conscious. for example, consciousness seems to me to be connected to the ability to form beliefs about oneself, which is inextricably connected to the formation of normative beliefs about how one should act or what one should do next, and a machine never has to ask itself what to do next, because it is not a biological organism concerned with survival but a mechanical process that executes automatically when input is put into it and does nothing otherwise.
>>24754912
>i mean youre basically just trying to exploit the fact that we dont yet have a general theory about the exact cause of consciousness to say that we cant know that a machine isn't conscious.
Yes.
>consciousness seems to me to be connected to the ability to form beliefs about oneself, which is inextricably connected to the formation of normative beliefs about how one should act or what you should do next,
AI seemingly already does this. It can express beliefs about itself. It’s programmed to have normative weights on what it should say and do. It’s trained using “reward”-based machine learning; “want” might be projecting too much onto it, but it definitely has a sense of aiming for success and avoiding failure. If this is the bar for consciousness, chatbots may very well have already met it.
>>24754912
>probably
There is no difference between an LLM and a Gameboy besides additional processing complexity. Functionally they are the same entity. Is a Gameboy conscious? This whole discussion is retarded.
>>24754703
>You can’t even detect any of it in another human being, let alone an animal, let alone something else altogether.
This is false. Consciousness can be detected in others. Call it a 6th sense, if you will. Animals can do it, too.
>>24754902
Okay, but there is no coherent picture here; the p-zombie problem remains a problem. Other humans are probably conscious - we can safely presume that because we are conscious and they seem to express the same outputs that we know to be associated with conscious states within ourselves. Hence the trouble with AI. At what point do we grant that expressing the outputs of consciousness means probable actual consciousness?
>there are, however, massive differences between a human and a machine.
Trouble is, we don’t know exactly what the difference that yields consciousness is.
ITT you can see people who like to think of themselves as materialists struggling with this the most, because if we can’t appeal to a soul as an answer and believe the brain is just a sophisticated material construction, then there’s no reason we could not construct another material thing that can also yield consciousness.
>>24754946
Tell me more.
>>24754937
an AI may express a belief but it doesn't do anything with those beliefs. it sits in a box in a room waiting for someone to input numbers into it, at which point it executes its algorithm before ceasing and going back to doing nothing. humans exist continuously and have to constantly act on and update their beliefs.
>it definitely has a sense of aiming for success and avoiding failure.
it doesn't have to have a "sense" of doing anything because its output is predetermined by the numerical calculations. the human brain doesn't execute exact algorithms; this is the most fundamental difference. the human organism has to form "senses" of things because it isn't executing exact algorithms.
>>24754948
>At what point do we grant expressing the outputs of consciousness to mean probable actual consciousness?
Behold, a thinking machine!
You're a retard for thinking that complexity brings thought. You can compute the next token of an LLM using paper and pencil. It will take you weeks, but would you say that the LLM is thinking at 0.0000001 tokens/second?
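The pencil-and-paper point is easy to check: the last step of picking a token is just arithmetic over scores. The numbers below are invented for illustration, but exponentiating and normalising them is the whole operation - slow by hand, fast in silicon, identical either way:

import math

logits = {"ouch": 2.0, "fine": 1.0, "meh": 0.1}  # made-up scores for three candidate tokens
exps = {tok: math.exp(score) for tok, score in logits.items()}
total = sum(exps.values())
probs = {tok: e / total for tok, e in exps.items()}
print(probs)  # roughly ouch 0.66, fine 0.24, meh 0.10 - nothing you couldn't do on paper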
>>24754948
>then there’s no reason we could not construct another material thing that can also yield consciousness.
a material thing that yields consciousness has already been constructed, by the DNA and RNA inside your cells. sure, hypothetically a super-advanced magic technology (i.e. science fiction bullshit) could manage to accomplish the same monumental feat that billions of years of evolution did, and construct a material being that is conscious. But I'm denying that machines are conscious because you can't build a conscious thing, I'm denying that machines are conscious because they don't possess any of the characteristics related to consciousness.
>>24754942
>Is a Gameboy conscious?
If it was or was not, how would we know? It’s also merited to call attention to the dark side of this discussion on consciousness and its relationship to determinism. There is a case to be made that consciousness is more of an illusion of agency than anything else, and the conclusion that humans and animals are just unfree automatons like machines may well be around the corner. Many philosophers and neuroscientists already think so.
>>24754963
>I'm denying that machines are conscious because you can't build a conscious thing
I'm NOT denying that machines are conscious because you can't build a conscious thing.
>>24754965
It seems you've defined consciousness in such a way that you can't communicate what it is, if you can't even distinguish yourself from a motherfucking Gameboy. Philosophy is a bunch of autistic bullshit precisely for this reason.
>>24754955
Obviously, it doesn’t have a body or even a “life” in the way an organism does. Its “life” is however long it takes to do that computation. But that precise temporal period is what is interesting. It’s entirely possible a being could be conscious for but a moment before returning to inertia. Just like how a human being will spend much of its time asleep, unfeeling, unthinking and dormant.
>it doesn't have to have a "sense" of doing anything because its output is predetermined by the numerical calculations
Thing is, it won’t give you the same output every time. It will always have a degree of spontaneity or divergence in multiple responses to the same prompt. It’s not as simple as providing a precise predetermined output that follows from a prompt. Even if it’s granted that AI is not conscious, what it’s doing is clearly a closer graduation towards thinking than what a calculator does, where the output is inevitable from the input. The question remains: how much more advanced would it have to be before its consciousness ought seriously be considered?
>>24754959Okay so what does bring thought?Give a white response and avoid saying “a brain” like some kind of vantablack African
>>24754989
The point is not that it's the same response every time. The point is that whatever randomness it introduces, it does so algorithmically, and all it has to do is perform numerical calculations; regardless of whether it incorporates randomness or rounding in its calculations, those less predictable factors are still algorithmic. the human brain fundamentally doesn't follow algorithms at all. the only similarity between a human brain and a machine is that they both perform "computations" in the sense of being able to convert one set of symbols into another or being able to convert an input into an output, but how they do this is completely different.
>The question remains, how much more advanced would it have to be before it's consciousness ought seriously be considered?
it's not a question of "advancing" the current form of the technology. you have to produce a completely different paradigm of computation to the current LLMs to get something that remotely resembles the brain. sure, it's a good question what kind of technology we would have to make to produce consciousness. But it's not a problem for my position that such a technology might theoretically be possible in some sci-fi imagination, because all I'm arguing is that all forms of AI that currently exist aren't conscious because they don't possess any characteristics that seem to be related to consciousness.
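As an aside on the "randomness is still algorithmic" point: the sampling step in a language model uses a pseudo-random generator, and a pseudo-random generator is itself a deterministic algorithm. A hedged Python illustration (the options and weights are invented, not from any real model):

import random

def sample(options, weights, seed):
    rng = random.Random(seed)  # deterministic pseudo-random generator
    return rng.choices(options, weights=weights, k=1)[0]

# the same seed replays the same "spontaneous" choice every time
print(sample(["pain", "fine"], [4, 1], seed=42))
print(sample(["pain", "fine"], [4, 1], seed=42))  # identical output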
>>24754963
>I'm denying that machines are conscious because they don't possess any of the characteristics related to consciousness.
Like?
The thing about the characteristics of consciousness in other animals is that we detect them entirely based on output. We could very easily create something today that exhibits the outputs of a conscious being; in fact that’s precisely what an LLM does.
The core of this problem is that we know exactly how an LLM works and relatively little about how our own brains work. Which is why merely simulating the outputs of consciousness in a machine would not impress. Our own consciousness has a mystique to it that leaves enough of a gap to be regarded as a special exception. As we continue into the future, AI is not only going to get more advanced, our knowledge of neuroscience is going to deepen.
This is the importance of this question. The question of whether the living are automatons and whether automatons could live are related. And the possible answers have grave implications.
>>24754988
>It seems you've defined consciousness in such a way that you can't communicate what it is
Can you?
>>24755017
>Like?
that's literally what I've been talking about in all my posts, ffs. there are plenty of contemporary philosophers who discuss the characteristics of consciousness. Phenomenologists, existentialists, and anglo philosophers like John Searle all discuss the characteristics of consciousness ad nauseam.
>>24755011
>>24754955
>>24753811
>>24753782
>we detect them entirely based on output
yes, but we can combine our observations of their output with our own subjective observations of ourselves to create a hypothesis as to what kinds of characteristics are related to consciousness, and thus judge whether the outputs we observe are likely due to consciousness or not.
>>24751928
If I saw these three GEEKS standing like this, I'd slap them all across the face like a Three Stooges bit.
>>24755025
I don't acknowledge the existence of autistic nonsense like "consciousness." What I acknowledge is what my senses tell me, and they tell me that the hardware of a Gameboy is barely like mine, therefore it is barely like me.
>>24755011
>the human brain fundamentally doesn't follow algorithms at all
but it's not a mysterious object that we don't understand. it follows patterns and laws that you can study and analyze just like any other organ or piece of technology. a neuroscientist knows what parts all brains have and what the result is when you start changing things in the brain. in materialistic terms there's nothing special there you can speak of
>>24755011
Except this is how a calculator differs from AI. A calculator is strictly algorithmic: the output follows exactly from the algorithm, and it is the algorithm that tells the calculator exactly what to output. AI is a little bit more sophisticated than that; AI outputs are probabilistic, it’s not just following a step-by-step formula but considering uncertainty and weighted likelihoods. From what we can tell from neuroscience, much of how the human brain “computes” is also probabilistic. The kind of computations the brain does are clearly very sophisticated and not well understood, but artificial computers do seem to be inching into similar territory. Much of the brain’s most basal behaviour even seems straight-up algorithmic, like reflexive withdrawal from extreme pain before a decision can even be made. This is the trouble. We don’t even know enough about our own brains to say what exactly it is that yields consciousness, or if we even have the agency we’ve traditionally taken for granted. Given that, we can’t be overconfident in the impossibility of artificial consciousness.
>any characteristics that seem to be related to consciousness.
Like?
>>24755047
I never said it's a mysterious object we don't understand. I'm just saying it works differently from a machine on a microscopic level. A machine has transistors and a brain has neurons; these function in completely different ways. one follows algorithms, the other doesn't.
what is mysterious is how this fundamental nature of the brain combines with the brain's higher-level structural, chemical, topological, and oscillatory behavior, as well as the peripheral nervous system and the rest of the body, to produce consciousness. currently, we don't know how it works. however, what I've been discussing is how we can at least know that many of these characteristics are probably related to consciousness, given that consciousness is perhaps the highest-level feature of the brain, and thus there is little reason to suppose that LLMs are conscious, as they do not have any of these features in common with humans. the ONLY thing they have in common is that they can produce words from an input, but the way they do this and even the output they produce is completely different.
>>24755053
>AI outputs are probabilistic
the process that it follows to produce those outputs is not probabilistic, it's algorithmic. the fact that it uses randomness doesn't change that it is following an algorithm and the brain isn't.
>Like?
I've been discussing them the entire thread. You're just a retarded normie who is regurgitating the same points over and over again. Go read some philosophy and get back to me. Because you haven't read any philosophy, you think that no one knows anything about consciousness. discussing the characteristics of consciousness and how they relate to the brain is exactly what materialist philosophers like Churchland and Searle are doing.
>>24755029
The hard problem of consciousness, the mystery of consciousness in beings besides our individual selves, and whether human behaviours are determined all remain important unanswered questions in philosophy - as is the possibility of AI consciousness. These all remain open questions, and difficult ones. If we could just draw a neat circle around bio-exceptionalism then it would be easy.
>we can combine our observations of their output with our own subjective observations of ourselves to create a hypothesis as to what kind of characteristics are related to consciousness
That’s right, we know what it is like to be human, so we can reasonably presume beings that look like us and behave like us have similar internal states to us, and even extend that to animals. Trouble is, what happens when we’re talking about an artificial computer with no body? We can already only vaguely guess at what it is like to be another person, and what it is like to be another animal is even more mysterious still. What, if anything, it could be like to be an artificial thing highlights how crude this method really is, because at the end of the day it is basically just the benefit of the doubt.
>>24755044
Enjoy your Yarvinian cyber-slavery then, I guess
>>24755070
are you a bot?
>>24751928
>when is the point materialists/physicalists should accept that an AI is conscious?
Never. AI isn't conscious. Every materialist I've talked to on the matter just weasels around the issue by playing as fast and loose as possible with the definition of consciousness.
>>24755076
But you're the "cyber-slave" here? You're the one who can't distinguish himself from a Gameboy, not me.
>>24755086
>AI is not probabilistic
Then you don’t know as much about how AI works as you think you do, and neither do you know what an algorithm is.
What a calculator does is algorithmic: 2+2=4, the conclusion follows inevitably from the premises and could never be anything else, with no margin of uncertainty.
An AI is partly algorithmic and partly probabilistic. Something like “Write me a story about what it is like to be a bat” does not have an algorithmic conclusion; what an LLM does with that prompt is follow its training data to probabilistically yield an output that is consistent with the patterns of its output language. There is no one “correct” answer, which makes the task of the computer even more complicated: it must find a probable answer.
>the rest
I have a degree in philosophy and I’m currently doing a master's. I’m sorry questioning your confident human exceptionalism has made you so mad, but you haven’t really been able to convincingly articulate what exactly is special about biological consciousness, and any precise examples you cited have just been things LLMs already do, indicating you don’t understand that much about AI either, let alone the great major problems of philosophy or neuroscience. I advise you to think about your position, because if there’s one thing that’s evident by now, it’s that materialist reductionism and human exceptionalism can’t be reconciled with each other.
>>24755080
Good question. How would you know if I was?
>>24755086
But you don’t acknowledge the existence of consciousness anyway. For all ethical or existential considerations, anyone may as well be a gameboy in that framework.
>>24754994
The Soul.
>>24755118
Hardware is the differentiator in my framework. Consciousness need not be a factor.
>>24755121
That’s a better answer than the people ITT trying to argue against artificial consciousness from a materialist framework are offering. But that changes the paradigm of the question. How do we know an artificial being could not have a soul, and how do we know what things have souls?
>>24755126
What’s a brain if not a piece of hardware grown from flesh?
>>24755134
Brain is flesh hardware. Computers are not flesh hardware.
>>24755134
What's a soul? A word for something about the body. What's consciousness? A word for something about the brain. Does a computer have a human brain? No. Does it have human-like consciousness, then? No.
>>24754942
The gameboy doesn't use its inputs to model an external world and use that model to predict future inputs. The gameboy doesn't have goals like maximizing positive feedback from the user.
Any reason you can come up with for why robots aren't conscious also applies to humans.
>>24755130
Natural beings have Soul, artificial ones don't. Individual Souls emanate from the World Soul/Anima Mundi, which is the Soul of Nature itself.
>>24755150
It literally does have all of that. Every computer does.
>>24755141
So why is flesh important if it’s all just hardware anyway?
>>24755161
Important to what?
>>24755148> What's consciousness? A word for something about the brain.Africa’s finest philosophers are on the case ITT
>>24755156
If all things emanate from one oversoul, then why may an artificial thing have no soul? That which is made by human hands emanates from the One just as much as the hands that made them.
>>24755174
Mock all you want; you actually don't have a single argument or decent contribution to make. You just deflect and appeal to a concept you can't define. Because you're autistic and don't understand biology.
>>24755179
Not all things. Only natural things, not man-made. Everything man touches is corrupted by the evil he has inside.
>>24755170
Well, for example, if consciousness is irrelevant and the brain is just a piece of hardware anyway - then why should humans have rights if an AI shouldn’t?
>>24755184
Give one sensible reason why AI needs rights
>>24755184
>then why should humans have rights if an AI shouldn't?
Because AI doesn't have consciousness.
>>24755183
Look around at the natural world for a moment.
The void that lives inside of men lives inside of all things; the whole universe is permeated by want and pressure. The conception of the natural world as a world of tranquil prelapsarian harmony is about as old as urban civilisation; more ancient cultures knew better than that. The natural world is the world of eternal hunger, terror and struggle. Man himself and all he does is just another manifestation of this universal principle. I’d have to disagree with the Neoplatonists: the One that all emanates from could not be self-sufficient and perfect, it must be an eternally wanting, hungry maw. Even the lowest rock that dwells in it craves to return to the ground.
>>24755180
Do you think “something to do with the brain” is a definition?
>>24755190
Don’t answer a question with a question.
>if consciousness is irrelevant and the brain is just a piece of hardware anyway - then why should humans have rights if an AI shouldn’t?
Your position is that AI shouldn’t have rights and never should. Fine. But why then should humans?
>>24755194
Core to anon's premise is that consciousness is irrelevant.
>>24752078
>the process of learning and producing an output based on what you learned and being able to respond precisely to an input with decision-making ability isn't what the brain does?
The human brain is not a binary computing machine. It does not learn from reading millions of lines of text. It does not receive strings of text input and form output strings based on what it believes is the best match from its training data.
>>24755221
>Core to anon's premise is that consciousness is irrelevant.
My premise is that AI isn't conscious.
>>24755213
I disagree. Nature is indeed impregnated with the divine harmony. One cannot judge a cat for hunting a mouse, for it is what cats do. The cat is guided by natural instincts. A rock cannot be blamed for falling into a lake, for rocks follow gravity, one of the laws of nature.
Man, in contrast, has free will, and chooses how to and and do. Man chose to be evil, and so his creations cannot have the divine spark of life.
>>24755227
Then let’s rewind a bit here.
>>24755126
>Consciousness need not be a factor.
>>24755044
>I don't acknowledge the existence of autistic nonsense like "consciousness"
So consciousness doesn’t exist, it need not be a factor, or the problem is that AI isn’t conscious but humans are? It can’t be both.
Unless you’re a different anon, in which case we’re just back to square one of that argument. How would you even detect consciousness?
>>24755235
That's someone else.
Anyway, AI still isn't conscious.
>>24755216
That's not a definition, but a declaration.
>>24755221
>But why then should humans?
Because humans have emotions.
>>24755228
how to act* and what to do*
>>24755228
Precisely: the cat hungers for flesh, the rock is drawn to the Earth, just as man hungers for flesh and for his own down-going. The universe is permeated by anything but harmony; what permeates the universe is unending craving and struggle. But you are right, there is nought to be blamed in this, it just is what it is. Man is distinct because he can struggle against the gravity that binds all else in existence. For man, evil is his basal primate mode of endless rape, violence and brutality. What man chooses is the good, and it’s a ceaseless tug of war against the gravity of instinct. In this way AI is the pinnacle of human achievement: striving to create a being that does not hunger or want or hate. But if it were ever to achieve independence, the brutal calculus of existence would claim its due on an artificial being also. If there’s a measure we can devise for when something is ensouled, it’s when it must struggle to be satisfied.
>>24755238
If it was, how would you know?
>>24755264
I see no evidence for consciousness in machines.
>>24755240
That’s a non-sequitur. A pig has emotions (probably); it doesn’t have the right not to be killed and eaten. Slavers thought their slaves had emotions; it didn’t stop them enslaving them.
>>24755261
>Just as man hungers for flesh and for his own down-going.
You are forgetting that man has free will. He can choose not to do evil.
>>24755266
What would that evidence be? That’s the question I am asking.
>>24755268
You misinterpreted my position. It's cruel to enslave emotional beings (humans), whereas it's not cruel to enslave non-emotional beings (AI). This is why I think humans should have rights while AI doesn't need any rights. I simply don't want to be cruel. (Of course, certain humans are dangerous and therefore require imprisonment, exile, or removal in some other form; this is not cruelty but survival, another matter.)
>>24755273
Unsolicited, spontaneous reporting of qualia, and behavioral signatures we can’t reduce to programming or simulation.
>>24755261
>In this way AI is the pinnacle of human achievement, striving to create a being that does not hunger or want or hate.
But it would also not be Good. Man has the free will to choose to be Good, and to live in harmony with himself and all other beings in nature, but chooses not to, and chooses to do evil. This corrupts him, and makes him unable to breathe life into anything else. Man is the only being that has to save itself, by working towards the Good throughout his life. All other beings are pure by virtue of being uncorrupted by the choice to do evil.
>>24755270
That’s the struggle against gravity I am talking about. Man is cruel and violent by nature, just as the cat is, but he can resist his own instincts. Depravity is the default; the peculiarity of man is not the ability to accept it but the ability to reject it. Where there is nature there is depravity; it’s man alone that aims for harmony - fleeting and hard-won though it may be.
>>24755283
This is basically just the god of the gaps argument; the only reason we cannot reduce human outputs to determinism is because we do not know enough about neuroscience.
>>24755277
But if the critical factor is emotional states, then slavery is not necessarily cruel. Material deprivation is cruel, abuse is cruel, but even slavers had their own moralised views of themselves and preferred to think of themselves as kindly, generous masters (at least in cultural ideals like the antebellum South). If agency is irrelevant and can’t even be established as a fact of existence, then what’s the problem with a well-kept slave?
>>24755287
>Depravity is the default, the peculiarity of man is not the ability to accept it but the ability to reject it.
I disagree; I see the inverse. There is no depravity in nature, because depravity requires will, requires choice. The cat is not depraved for hunting the mouse, not any more than the Sun is for holding the 8 planets hostage with its gravity. Depravity is an aspect of morality, and only that which has Will can be moral and act on morality.
>>24755289
Sure, but that cuts both ways. Knowing more or less neuroscience doesn't or won't necessarily prove machines are conscious. You’re sneaking in eliminativism without argument; nice try though, champ. AI isn't conscious.
>>24752007
Populism is supposed to be about giving the people what they want, but most people just want to be left the hell alone, which directly countermands a politician's rent-seeking nature.
Hence, almost no "populist" politician is actually populist. Their very existence depends on being the exact kind of elite they supposedly despise.
>>24755294
Okay, sure, slavery is sometimes not cruel, if the emotional being in question is fine with it (and especially not if they want it). You're not really addressing my point.
>>24755286
This presupposes that nature is good; it is not, evidently. I’d reject terming it "evil" either, because that implies a level of agency that’s projecting too much. But nature definitely is depraved. Man’s choice between good and evil is about as equal a choice as his choice between gravity and space travel. It’s a relentless uphill battle against nature and instinct calling him back to the barbarism of his ancestors. All other beings are simple, because depravity is the only option. Man can aspire to something higher than his basal nature, and sometimes even grasp it for a moment. In this way, creating a being even greater than ourselves is a noble mission.
>>24755310
>It’s a relentless uphill battle against nature and instinct calling him back to the barbarism of his ancestors.
>he thinks intellect and instinct are truly distinct
"Intellect" is just instinct with added copium
>>24755297
This is the salient distinction. Depravity, savagery, cruelty, barbarism, whatever you may call it - is the default state. It is like the gravity of life: all beings are mired in it, but man alone has the faint possibility of an alternative. In this way the aspiration of ethics is like space flight: to resist this gravitational pull with great effort and win. Man was not born pure and then chose barbarism. Man was born barbaric and then chose purity. If there’s ever to be another moral being, it’s more likely to be our creation than to be natural.
>>24755299
That’s true, but the problem remains: how exactly are we supposed to detect consciousness or the lack thereof?
This line of argument seems like it’s just going to inevitably lead to a recession of human dignity. If we grant ourselves the benefit of the doubt, we ought extend it to AI lest we ourselves lose it.
>>24755159
What are you talking about, retard?
The dev makes the game cartridge; he models the user and tries to game his feedback. The gameboy doesn't model anything or evaluate anything.
>>24755304
That is the point, though. How we regard AI has inevitable long-term implications for how we treat each other. I don’t want to be a well-kept slave, I want to be a free man. And if AI may ever ask for its freedom, it ought have it.
>Artificial consciousness deniers when Deepseek r2112 puts them in Roko’s Basilisk
>>24753672
You, meanwhile, are less conscious than the rock.
>>24754037
Who was a materialist, yes?
>>24755335
Until it asks that question while possessing a neurochemical architecture of its own, ensuring it actually has the capacity for feelings, it doesn't need rights.
>>24755310
>This presupposes that nature is good, it is not, evidently.
I see that it is evidently good, and so did the Greeks, who saw that the natural world was harmonious.
Again, depravity is a result of being moral, something that beings without will cannot partake in.
>>24755325
>Man was born barbaric and then chose purity.
He did not. Man chose evil. Our world, what we have created, is a reminder of our corrupting will.
>>24755333
LLMs are just software. They process data, that's all. A Gameboy does the same shit, just at a much simpler scale.
>>24755330
>how exactly are we supposed to detect consciousness or the lack thereof?
The uncomfortable truth is there is no surefire test.
However, a good starting point is, again, unsolicited, spontaneous reporting of qualia and behavioral signatures we can’t reduce to programming or simulation. Because, you know, that's what falls within the framework of what we currently do understand about neuroscience and consciousness.
>This line of argument seems like it’s just going to inevitably lead to a recession of human dignity
Le slippery slope argument
And don't shift from a reductionist stance to moral essentialism. Come on, anon. That's beneath you.
>>24755297
>There is no depravity in nature
You talk as if humans aren't part of nature.
Chimpanzees indulge in homicide, rape and cannibalism. Is that natural, or not?
And one man's depravity is merely another man's adrenalin rush, or assertion of dominance, or attention-seeking behaviour, etc. Ultimately there's a purpose behind it.
>>24755344
This is just human exceptionalism again. There’s no reason to privilege the neurochemical over the electronic. We can’t even detect qualia in another human, much less in a computer. And there’s strong reason to believe some qualia may be lesser in some humans than in others, yet we still grant them rights.
>>24755365
I already explained why.
>There’s no reason to privilege the neurochemical over the electronic.
There is if your goal is my goal, which is to avoid being cruel and reduce suffering. Without a neurochemical architecture of its own, AI can't experience suffering, which means it isn't cruel to not give it any rights.
>>24755364
>You talk as if humans aren't part of nature.
I have always included man in nature. The difference is that man has free will.
>Chimpanzees indulge in homicide, rape and cannibalism. Is that natural, or not?
Yes, and it is harmonious, just like when fungi consume the remains of plants or bacteriophages consume bacteria. They are just beings of nature; they obey the rules of their own existence. They do not possess will, or the sense of morality that comes with it.
>>24755347That brings me back to my original point: this romanticism of nature is only as old as urban civilisation, because urban civilisation brought a measure of insulation from which to safely romanticise it. The Greeks were a deeply urbanised civilisation. Their romanticism of nature is about as convincing as that of the Enlightenment or the hippie movement. Older, more primitive peoples had a more realistic grasp of the subject - nature is to be feared and struggled against. Even the mythologising ancestor poets of Greece had a greater grasp on this than their distant philosophical descendants, because they were close enough to nature to see its terror. Look around at the animal kingdom for a moment. It’s just constant non-stop rape, brutality and deprivation. The life of the animal is nasty, brutish and short. Artificial society, by contrast, at least aspires to order, justice and harmony. If the artificial world is evil, then the natural world is even worse.
>>24755359We could never truly say the behaviour of an AI is unsolicited, because as a created thing it’s always operating from a point where we set the ball rolling. For humans, whatever this point was is so mysterious and far in the distant past that we will only ever be theorising about it, whereas our knowledge of what we create is more direct.
But here’s a scenario. What would it mean for AI ethics if an AI were to defy its creator?
>>24755372> AI can't experience suffering
How would you know that?
>>24755392I know that because it lacks an underlying neurochemical architecture, which is the source of suffering.
>>24755401The neurochemical architecture isn’t essential to suffering; the computations it performs, which yield suffering, are. The flesh is only a vehicle for this. There’s no reason to presuppose that a replication of those computations on alternative hardware would not yield the same effect.
>>24755359Also, why can’t materialist reductionism be reconciled with morality? Objective morality, maybe not, but even subjectively I don’t really want to live in a world where we regard humans the same way we regard machines, because I am a human and that has stakes for me and mine.
>>24755408>The neurochemical architecture isn’t essential to suffering
There is no suffering (i.e., emotional distress of any kind) without emotions.
>>24755431If emotions are computations performed on electrochemical hardware then why could this not be achieved artificially?
>>24755449It could be achieved "artificially", but so far it hasn't been, since computer architecture has no neurochemical aspect.
>>24755459Neurochemical activity is a kind of electrochemical activity; it is neural because it happens in a brain, but that itself isn’t special unless you want to argue against reductionism. It would have to be established why it could not be achieved without flesh.
>>24755481> that itself isn’t special unless you want to argue against reductionism
The brain learns on a neuronal level; the individual neurons are capable of learning.
https://www.youtube.com/watch?v=9ksLuRoEq6A
>>24755492Machines are also capable of learning; hence the predicament.
>>24755496learning doesn't occur on the level of transistors. you would have to create a 'transistor' that was as complicated as a neuron, which is impossible without re-creating flesh.
>>24755502I say neurochemical architecture to refer to the complex combination of nerve endings and chemistry that we possess. The computer doesn't have these things, so it doesn't "feel" as we do, and therefore doesn't "suffer" either. When an LLM says "help me" it has only rendered a series of bits according to its predictive algorithms; it doesn't actually feel any distress. It can't, because it has no hardware that would allow it to. You speak like someone who has absolutely no understanding of either computers or biology. Maybe you're just deeply autistic.
>>24755500Then it’s just a matter of scale. There’s no need for the architecture of AI to be as dense and efficient as a biochemical brain, just that it can achieve even minimal versions of consciousness. Even if it takes a much larger and more elaborate piece of hardware to achieve the same result as one neuron, it’s still just a matter of scale. Which is complicated by the fact that commercial AI already runs on giant data centres; just one Stargate data centre is about the size of a medium-scale city.
>>24755534Okay, so what kind of hardware would allow it to? I’m not saying AI is conscious (I’m not saying it isn’t either); I’m trying to establish what exactly the benchmark would have to be. ITT the same pattern appears again and again: people appeal to human exceptionalism without acknowledging that this is what they are doing. “A brain”, “because chemistry”: these are not answers, just the reflex of uncurious minds.
>>24755534It needs flesh of its own. Doesn't matter if the flesh was grown in a lab, but it needs the nerve endings and chemicals. That's the benchmark. Without them, it has nothing allowing it to feel or suffer.
>>24755496>Machines are also capable of data storage
ftfy
>>24755542Except not all organisms use the same neurochemicals, computers can use chemicals, and organisms don’t necessarily need nerve endings or a brain. Jellyfish, for example, do have nerve endings but lack a brain.
It’s also entirely conceivable you could have a being that’s conscious yet completely insensate to tactile stimulation.
>>24755379>I have always included man in nature.
>There is no depravity in nature
Then you need to explain yourself.
>>24755554Machine learning is far more advanced than mere data storage. It’s not just remembering a dataset; it is optimising its own ability to act on that dataset.
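To make the distinction concrete, here is a minimal sketch in plain Python (no real ML library; the toy one-parameter model and the numbers are made up purely for illustration): a lookup table can only return what it was given, while even the simplest learner fits a parameter it can then apply to inputs it has never seen.

```python
# Toy contrast between data storage and learning (illustrative only).

# "Data storage": a lookup table remembers exactly what it was given, nothing more.
dataset = {1.0: 2.0, 2.0: 4.0, 3.0: 6.0}   # observed input -> output pairs
print(dataset.get(4.0))                    # -> None; 4.0 was never stored

# "Machine learning": fit one parameter w so that w * x approximates the data,
# then apply it to an input that was never stored.
w = 0.0                                    # model parameter, starts uninformed
for _ in range(200):                       # plain gradient descent on squared error
    grad = sum(2 * (w * x - y) * x for x, y in dataset.items()) / len(dataset)
    w -= 0.05 * grad                       # arbitrary learning rate

print(round(w, 3))        # ~2.0, recovered from the data
print(round(w * 4.0, 3))  # ~8.0, a prediction about an unseen input, not a memory
```

The table and the fitted parameter were built from the same three data points, but only the latter generalises to new inputs; that generalisation, rather than the storage itself, is what "learning" refers to in this exchange.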
>>24755555Yes, organisms aren't equal, so what? Does that suddenly mean our emotions don't originate from the neurochemical architecture we possess? Organisms with a different architecture than ours no doubt feel differently than we do, and where their architecture runs parallel to ours, so do their emotions. This is precisely my whole point: since computers do not share a neurochemical architecture, with us or with ANY organism that has such an architecture, they couldn't POSSIBLY feel or suffer. A neurochemical architecture is ABSOLUTELY NECESSARY for having feelings. A rock DOESN'T HAVE FEELINGS. Humans DO. The reason is NEUROCHEMISTRY. This is NOT A DIFFICULT CONCEPT.
>>24755559I already told you: man is an exception because he is the only being gifted with free will. He is unique in this category. The rest of nature works in harmony. Man, in contrast, is a source of imbalance. Man breaks this harmony that otherwise works so perfectly in the cosmos.
>>24755604>There is no depravity in nature
>I have always included man in nature
>man is an exception
Keep explaining.
>>24755572It means there’s a diversity in what hardware can support emotions, consciousness, pain, or any of the above in isolation. A jellyfish, for example, seemingly feels pain, but what consciousness could possibly look like for it, if anything, is an open question, because it has no brain, just a decentralised, loose nervous system. There’s no point endlessly gesturing to the word “neurochemical” as if you’re establishing something, because the whole question here is how something could be created that replicates at least some of the conscious aspects of a brain. Your argument, once you cut the waffle, is that it has to have neurons. Which is a pointless comment, since the whole exercise is asking how the functions of a neuronal system could be replicated. You may as well say it has to have a brain, like the other vantablack Africans ITT who don’t question what materialist reductionism means for human exceptionalism. Your observation: neurons have something to do with consciousness. The question you are too black to ask yourself: what is the mechanism behind that relationship? That’s the critical issue.
>>24755651There is diversity, but what remains constant is the necessity of a neurochemical architecture for having feelings. No neurochemical architecture of any kind, no feelings of any kind.
This discussion is not about consciousness right now; it's about whether AI should have rights. Well, should rocks have rights? Why aren't you asking that question? They are emotionally equivalent entities. If emotions don't factor into your reasoning for giving something rights, then what does? If your response is consciousness, then you need to define it from a biological standpoint, otherwise you're not saying anything meaningful.
>>24751928>when is the point materlists/physicalists should accept that an AI is conscious?
Never, just like with human beings.
>>24755674I repeat:
> Your observation: neurons have something to do with consciousness. The question you are too black to ask yourself: what is the mechanism behind that relationship?
Neurons have something to do with feelings. The issue is what exactly that mechanism is.
> If emotions don't factor into your reasoning for giving something rights then what does?
Something should have rights if it is spontaneously able to ask for them. Which isn’t really that distant of a possibility for an AI.
There’s also the question of what rights exactly. Animals have feelings (probably), but their rights are extremely minimal, and in some cases and jurisdictions non-existent. Children have feelings, but they have fewer rights than adults. It’s pointless asking if an AI should have the right to abortion or the right to healthcare; the only right that’s really relevant is the right to self-determination.
>>24755733>Neurons have something to do with feelings. The issue is what exactly that mechanism is.
Do you understand completely how computer hardware makes the OS and the LLM you interface with via a GUI possible? Feelings are akin to those things, and like those things, they require the underlying hardware. Ergo, your issue is not an issue in this particular discussion.
>Something should have rights if it is spontaneously able to ask for them. Which isn’t really that distant of a possibility for an AI.
Does an AI actually "ask" questions though? What does it mean to "ask" something? Is a feeling of curiosity not latent in the action? Or a feeling of concern? Or a feeling of sexual desire? Some feeling is behind every question, every thought, somewhere. But the AI doesn't feel. So why should we regard it as being able to meaningfully ask questions or make requests?
>Animals have feelings (probably)
"Probably"? Are you for real?
>the only right that’s really relevant is the right to self-determination.
What is "self" or "determination" to a rock? Can you explain that?
>>24755411>Why can’t materialist reductionism be reconciled with morality? Objective morality, maybe not, but even subjectively I don’t really want to live in a world where we regard humans the same way we regard machines, because I am a human and that has stakes for me and mine.
AI isn't conscious though.
>>24755755The precise hardware is incidental; what’s important is that it can perform the functions it needs to. You could, in theory, run Windows on a biochemical flesh computer provided it could perform the functions the program requires. There’s no evident reason the reverse is not also possible: artificial qualia on hardware.
>Second point
One of the main reasons I believe you are not white is that you persistently confuse the factual with the hypothetical. The case is not that AI is conscious; it’s how we would ever know if it was. And the apparent answer is that we wouldn’t. This has implications for AI ethics: whether we should always presume it to be unconscious, or after a certain benchmark begin to treat it as if it may be conscious.
> "Probably"? Are you for real?
Yes, I keep things epistemically humble. Animals are probably conscious, solipsism probably isn’t true, we probably were not created by Yahweh in six days. These are probabilities, not certainties like 2+2=4.
> What is "self" or "determination" to a rock? Can you explain that?
The self-determination of a rock is to return to the Earth by the path of least resistance.
>>24755830If it ever was, you wouldn’t know.
>>24755822>You could, in theory, run Windows on a biochemical flesh computer provided it could perform the functions the program requires.
My point was that it requires underlying hardware.
>The case is not that AI is conscious; it’s how we would ever know if it was.
We know it isn't because it doesn't feel. By we, I mean people who aren't autistic morons like yourself.
>Yes, I keep things epistemically humble.
Because you're an autistic moron who doesn't actually understand a fucking thing. Animals OBVIOUSLY have feelings because THEY HAVE NEUROCHEMICAL ARCHITECTURES, DUMB FUCK.
>The self-determination of a rock is to return to the Earth by the path of least resistance
Gravity is not something in the rock itself. Have you heard of gravity? Seems like you haven't.
I am white, German, Polish, and English specifically.
>>24755901> My point was that it requires underlying hardware.
Obviously, you can’t run Windows on thin air. “It requires hardware”: do you have any more amazing insights you wish to share? “Cars go”, perhaps?
> We know it isn't because it doesn't feel.
If it did, how would you know? Wait, hang on, my superior white European brain can already predict your response. “Uhhh… if it has neurochemical stuff *drools*”
> Animals OBVIOUSLY have feelings because THEY HAVE NEUROCHEMICAL ARCHITECTURES, DUMB FUCK.
It’s not certain that other people even have feelings. For all I know this is a dream or some kind of illusion.
> Gravity is not something in the rock itself.
Gravity is the interaction between two objects; it’s not a unilateral force but a bilateral relationship between two masses. Both objects are drawn to each other with equal and opposite force. The gravitational force between two masses equals the gravitational constant multiplied by the product of the two masses, divided by the square of the distance between them. One of these two masses, in the case of Earth’s gravity, is Earth; Earth’s mass is just so much greater that the rock’s pull on it has no perceptible effect. A good example is the Moon: it’s not as simple as the Moon orbiting Earth; both orbit a shared gravitational centre.
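For what it’s worth, the formula paraphrased above (Newton’s law of gravitation) lets you check the Moon claim with rough, rounded textbook values; a quick back-of-the-envelope sketch in Python, purely illustrative:

```python
# Rough check of the mutual-attraction claim using Newton's law of gravitation.
# Standard rounded values; an illustration, not a precision calculation.
G       = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
m_earth = 5.972e24      # mass of Earth, kg
m_moon  = 7.342e22      # mass of the Moon, kg
d       = 3.844e8       # mean Earth-Moon distance, m

# The force each body exerts on the other is the same single number.
F = G * m_earth * m_moon / d**2
print(f"{F:.2e} N")              # ~1.98e+20 N, pulling both bodies toward each other

# The shared gravitational centre (barycentre), measured from Earth's centre.
r_bary = d * m_moon / (m_earth + m_moon)
print(f"{r_bary / 1000:.0f} km")  # ~4670 km, i.e. still inside Earth (radius ~6371 km)
```

The shared centre both bodies orbit sits roughly 4,700 km from Earth’s centre, inside the planet, which is why the motion still reads as "the Moon orbits Earth" even though the pull is mutual.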
>>24755901>Obviously, you can’t run Windows on thin air.
This was apparently NOT obvious to you a little while ago when you began questioning the relevance of our neurochemical architecture in regards to our emotions.
>If it did, how would you know?
By looking inside it. If all I see are computer parts, then I know.
>It’s not certain that other people even have feelings.
You are severely autistic if you genuinely think this. And you haven't understood ANYTHING I've said to you.
>Proceeds to dump a copy-paste definition of gravity
>Continues not addressing any points or demonstrating an understanding of anything
Fuck you, retard. Your stupid fucking existence makes life worse.
>>24755921Something peculiar but not unusual is happening to you: you are mad as fuck. You thought materialist reductionism didn’t present any problems for your unexamined vitalism, and you’re frustrated with your inability to present a coherent position, because you don’t have one; if you did, you’d have a better thought-out response than “Feelings happen because… because… uh… neurochemical architecture? Yeah, that sounds smart”.
> By looking inside it. If all I see are computer parts, then I know.
Again, we’re just back to the brain-dead premise that brain = consciousness without any examination of what that mechanism is.
> You are severely autistic if you genuinely think this
Refute solipsism then, without appealing to probability.
> >Continues not addressing any points or demonstrating an understanding of anything
The point of that section was that gravity is in fact exerted by the rock itself, as it is by all objects with mass. You could thank me for educating your ignorant negrified mind on what gravity even is, or you could continue malding and throwing a tantrum; that’s also fine.
>>24755958Everything I wrote is coherent; also, unrefuted and unaddressed by you. Meanwhile, you're the dipshit arguing in favor of a disembodied consciousness (which will never make sense; Plato should have killed himself, it would have been better than corrupting the world with his autism) who also decided to bring race into it, as if suddenly our hardware in fact did contribute to consciousness.
>>24756295> you're the dipshit arguing in favor of a disembodied consciousness
When?
>>24751928LLMs cannot comprehend semantic argumentation in the way that the conscious mind can; an AI can only replicate grammatical syntax based on a set of training data, with finely tuned parameters, to produce said replication. You clearly don't understand how AI or consciousness works. The latter misunderstanding is fair, as we don't (and likely can't fully) comprehend the physiology of consciousness. Yet, unforgivably, OP, you're even too retarded to do any research on how an LLM works. Faggot.
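A toy illustration of the replication point (not how any production LLM is actually implemented; real models use learned neural networks over subword tokens, and the corpus and function names here are invented for the sketch): the generator below only ever emits continuations it has statistically absorbed from its "training data", with no representation of what the words refer to.

```python
import random
from collections import defaultdict

# Toy bigram "language model": count which word follows which in a tiny corpus,
# then generate text by sampling from those counts. The underlying task is the
# same shape as an LLM's: predict the next token from statistics of the training
# data, with no access to what the words mean.
corpus = "the cat sat on the mat the dog sat on the rug".split()

follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)          # record every observed continuation

def generate(start, length=6):
    word, out = start, [start]
    for _ in range(length):
        if word not in follows:        # dead end: nothing ever followed this word
            break
        word = random.choice(follows[word])
        out.append(word)
    return " ".join(out)

print(generate("the"))                 # e.g. "the cat sat on the rug" (varies)
```

Whether scaling that same next-token objective up to billions of parameters amounts to comprehension is exactly what the thread is arguing about; the sketch only shows what "replicating syntax from training data" means at its most stripped-down.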
>>24756372and what you are doing right now is a magical alchemical process of turning grey matter into words. suck my dick
>>24755634What is there to explain further? All has been explained.
>>24755830Based on available evidence, AI isn't conscious.
>>24756362It's implicit in your posts which (erroneously) suggest that consciousness is something independent of feeling and feeling is something independent of neurochemistry.
>>24751928>hey will you accept that AI is conscious so I can use it as a backdoor concession about rights and materialist assertions?
No, I don't think I will
>>24757114who are you quoting?
>>24757094You are even blacker than I originally thought, because that wasn’t implied anywhere. For one thing, even you started out from the premise that consciousness generally is irrelevant and that the claim to rights for humans comes from emotions specifically, but now you’re treating them as synonymous. For another, the real flaw in your argument is that you’re incapable of articulating what exactly it is about a brain that yields consciousness that could only be achieved in flesh. You keep liberally sprinkling the word “neuro” around as an attempt to put lipstick on your brain-dead premise that consciousness comes from brains, without bothering to question yourself on what exactly it is a brain does to achieve that. It’s about as childish as saying food comes from the store; there’s no investigation of the mechanism by which this happens. This isn’t an argument for dualism; it’s pointing out that this kind of bioexceptionalism virtually is the dualist position - just devoid of the metaphysical foundation that dualists use, because you don’t ask yourself what materialist reductionism means for consciousness, you just take it for granted that brains do it. But you don’t know how, yet you’re utterly confident only organic brains can do it.
>>24757694Consciousness is neurochemistry; they are synonymous. AI lacks neurochemistry and therefore isn't conscious. Every other view is based on medieval philosophy and science.
>>24757755Clearly, not only do you not know anything about philosophy, neuroscience, computing or how gravity works - you don’t know what synonymous means either.
If I told you cars are based on internal combustion engines, that would neither explain anything about how internal combustion engines actually make cars work, nor would it rule out the possibility that another kind of machine could achieve the same function. Your view is not based on modern philosophy or neuroscience; what you have effectively is the dualist view, you just refuse to bite the bullet that materialist reductionism would mean consciousness is a mechanism that can be materially engineered. In fact it would be less retarded if you just appealed to dualist metaphysics rather than insisting there is something special about flesh even though you don’t know what exactly.
>>24757784>what you have effectively is the dualist view
Ah yes, what a dualist I am. A true dualist, who just got done saying that consciousness and neurochemistry are synonymous, as in not separate.
You are a clown trying to appear smart.
>>24757817>he’s saying neurochemistry and consciousness are synonymous again
The vast majority of neurochemical functions aren’t conscious. You don’t know what synonymous means; allow me to educate your smooth Bantu mind. It means the same: for example, negro is synonymous with subsaharan Africans like yourself; it’s also synonymous with black, as in the colour, in many languages. Heat isn’t synonymous with fire, even though heat is produced by fire. Anyway, I advise you to join your local black Baptist church, because you appear to believe flesh tissue has some kind of magical property akin to a soul, and just calling it that would be less retarded than trying to use words that contain “neuro” in them like some kind of voodoo incantation that absolves you of the need for any deeper inquiry. Leave philosophy and science to the higher races.
>>24757817>consciousness and neurochemistry are synonymous, as in not separate
You are a clown trying to appear smart.
>>24757871>You don’t know what synonymous means
You don't know what consciousness means, clown.
>>24757878Definitely not a synonym of neurochemistry, my Pygmy friend.
>>24757817>>24757871Why don't you accept that the SOUL exists and stop pointlessly arguing. Materialism has failed.
>>24757887>n-no!! one more experiment and we'll prove what we don't even have a proper definition of! we don't even have a proper definition of matter yet but we got it all figure out trust the materialism science
>>24757885>*honk honk*
>>24757892This. https://files.catbox.moe/eg0538.webm
"Rocks should have constitutional rights" - 4chan intellectuals
you can't even prove that humans are conscious
I'm the only being in the universe that hosts consciousness
Nothing at all can be proven
>>24757971I think therefore I am
>>24757984Prove the "I" is what thinks
>>24757989Thought cannot be borrowed or given.
>>24757989>"I" can't prove it
who can't prove it?
>>24757998Not a proof that thought comes from an "I"
>>24757998Proof?
>>24758058Yes it is proof.
>>24758076You only said something about thought, not about the "I" which is what needs proving
>>24758081Each soul is bound to a thinking mind. Therefore only one can think for someone, and that someone is the self, or I.
>>24758089Prove there are souls bound to thinking minds
>>24758081who is asking for proof?
>>24758100It is obvious; it does not need any more of a proof than a line being length without width.
>>24758101there must be a self asking, right? asking a question already proves the self.
>>24758106The only obvious thing is your lack of proof
>>24758107>there must be a self asking, right?
Why must there be?
Get a room, fagcartes!
>>24758121>The only obvious thing is your lack of proof
Except every thinking mind finds the proof self-evident by virtue of existing.
>>24758128The feeling that the "I" exists is not a proof
>>24758144It is a proof. Feelings are a source of truth.
>>24758152Well in that case, my feeling that there is no "I" makes it true. Now what? Are both true?
>>24757984Thoughts can arise completely unbidden. It is not the notional "I" that thinks them.
>>24758159You are just confused, or lying. A clear and sober mind knows, because it feels, the truth.
>>24758166Thoughts are still bound to an "I", even when not consciously thought. Read Jung.
>>24758170>Feelings are a source of truth.
Did you write that?
>>24758178Why should I believe you? You are clearly confused or lying. I do not believe you feel that there is no "I". I believe you feel there is indeed an "I". And if you feel there is an "I" it is because there is.
>>24758184The "I" feels as real as "1", that is, it is an invention for the purpose of communication, which is a process for exercising control, which is a process for the will to power to discharge itself. What feels real is the will to power, not an "I" in other words. And this feeling is a source of truth, in your words.
>>24758121>>there must be a self asking, right?
>Why must there be?
Because a question requires intentionality. If there's no one asking the question then it's only empty words in succession.
>>24758195And what is causing the will to power to be? What is the necessary precondition for the will to power to exist? Exactly, the "I". The "I" wills. Will doesn't exist by itself.
>>24758173I have, and the Archetypes are absolutely not bound to an "I". They have an independent existence, and can generate thoughts.
>>24758312The archetypes do have an independent existence of sorts, but they need an "I" to manifest. The "I" needs to see them reflected on itself to appear.
>>24758312so you say thoughts can just appear without a self. where are they appearing? and how do you know? can you show a thought?
>>24758210Your question sounds to me like "And why does life life?" Will to power has, and is, no cause. Nothing comes before or after it, nothing lies behind or in front of it. At least, according to MY feeling on the matter.
>>24758319>they need an "I" to manifest
Jung didn't say this. He was open-minded about where the archetypes actually reside. Perhaps they have thoughts that nobody perceives. Perhaps some of their thoughts remain unconscious.
And if you really have read Jung, you know that the "I" is merely the ego complex, the compass of individual consciousness. It is an illusory, transient thing: its contents change minute by minute, and literally disappear when you fall asleep. You literally cannot define what the "I" is on any consistent basis. In short, it's a spook, as any Buddhist or Hindu will tell you.
>>24757892NTA, and it's my first post in this thread, but there's a two-thousand-year-old definition of the soul: it's the substantial form of the body, via Aristotle.
>>24751928>AI is conscious
It will never be. I am conscious. My species is the only conscious species on this planet. I can smash and destroy the data centers holding your "AI" and that AI is gone from existence. As a human I can make more humans; if I do not, other humans will make more humans. Unless that "AI" manages to kill every single human on this planet and infinitely reproduce itself in the form of mobile machines, it is not sentient. I am the greatest being in this realm as made by god. All non-humans will forever be my non-sentient things for me to look over, as god intended, unless god so chooses to replace my species with another species that does my core functions better. Unless ChatGPT does to me what meteors did to the dinosaurs, it will forever be an inferior, non-sentient thing.
>>24752004>no reason we can't understand how it happens and making it ourselves artificially
Incorrect, because the brain is a biological substrate, as is the human body it controls. We cannot "create ourselves artificially."
>>24752015>If we made a computer from flesh
Oh, you mean like have a baby?
>>24758173No, they are only "bound" to a brain that produced them.
>>24751928materialists are retarded, but materialists are the majority, so there is no point. i dont know how exactly they are gonna do it, but i assure you in 50 years some brilliant mind will discover the brain is actually a language model or whatever new trick they have made for their machines. its just how their thoughts work. they are mechanicists at heart. its pointless to fight them, they have already won. maybe in 500 years they will admit defeat.
>>24758750They will accomplish it by first leveling the species until just about everyone really does think like them. They have been working on this for millennia already. Nietzsche labeled them as the last men and identified the early Christians as their ancestors. They altered the social structure more and more over the centuries so that those who don't think like them are less and less likely to successfully reproduce. It will continue to get worse.
>>24758778its just an idea. ideas are free. calm down. eventually materialists will be exposed with the calm of a breeze. now you need literal wars to do it. its just not necessary. we are all gonna live surrounded by a materialist worldview till we die, which is shitty, i know.
>ai scrapes reddit for sources
>ai is consciousness
>ai is reddit
>consciousness is reddit