/an/ - Animals & Nature

You shouldn't do neuroscience from the armchair, especially if it leads you to conclude that toddlers aren't conscious

People are taking swings at Eliezer for his claim that chickens, newborn babies, and other animals we eat are probably not conscious.

Eliezer’s views on animal consciousness are completely ridiculous, especially as justification for not being vegan.

Consciousness refers to having experience: to there being something it’s like to be you. Eliezer’s view, in a nutshell, is that to be conscious, one’s brain has to engage in higher-order self-modeling. On this view, consciousness requires a sense of self, so animals that lack one, and that can’t pass a mirror test, are not conscious. Similarly, he doesn’t think humans are conscious before they develop a sense of self.

This view suffers from several big problems:

It doesn’t fit well with the neuroscientific evidence at all.

It’s a highly specific theory of consciousness with no strong argument in its favor.

Even if a person was pretty sure of it, they shouldn’t be sure enough to think that animal welfare can be safely neglected.
>>
I can't think of anyone who takes his argument seriously. It's self-evidently silly. Now if we are talking about conscious mechanisms, like whether Orch OR is an accurate model, that's another thing
>>
>>5080324
Let’s explore these in more detail.

2 The neuroscientific evidence
Nearly every animal consciousness researcher in the world thinks that other mammals and birds are conscious. Likewise for infants. This is for a very straightforward reason: they behave as if they’re conscious, they have the brain states that have been observed to robustly correlate with consciousness, and they share the brain regions that give rise to human conscious experience.

If you observe a dog or a pig or a chicken, it will behave in the ways you’d expect it to if it was conscious. If you step on its tail, it will yelp and bite at you to try to get you to release the pressure on its tail. Dogs and pigs roll around in sleep, as if they are dreaming. They give yelps of pain. They wag their tails and play.

The parts of the brain that trigger these experiences are evolutionarily ancient. https://www.sciencedirect.com/science/article/pii/S1053810004001187?casa_token=1JYFbRD21WMAAAAA:qG-lYtVJ9OsKepMHwO3hzXh_2WWv3tFUC3hiWU2o-aeVV5d_P8uwkKWTRDAcAj-vrKla_T7m#aep-section-id12 They are shared across all other mammals, and mirrored in birds. There are certain kinds of brain states that robustly produce conscious states and these are quite different in character from those that produce unconscious states. Seth, Baars, and Edelman explain: https://www.sciencedirect.com/science/article/pii/S1053810004000893?casa_token=B3K3KvAUWb8AAAAA:-xmnYJuqoi_dlGijNwS9xdmnpq9jv1JtsBgGq7FOOXgEemDJdavp5naJVSntJwzIYjpyX_8V

Consciousness involves widespread, relatively fast, low-amplitude interactions in the thalamocortical core of the brain, driven by current tasks and conditions. Unconscious states are markedly different and much less responsive to sensory input or motor plans.
>>
>>5080330
Unfortunately there are a lot of people who believe healthy adult women aborting healthy and consensually conceived children well into the first trimester is moral who also believe factory farming is moral. Lots of people hold both of those beliefs at the same time
>>5080331
Unsurprisingly, other mammals and birds share such brain states (as well as a variety of other highly-specific conscious signatures). https://benthams.substack.com/p/against-yudkowskys-implausible-position?utm_source=publication-search Thus, the case for consciousness in other mammals is broadly similar to the case for consciousness in other people: their brains are like ours when we are conscious, and they produce behavior very much like our conscious behavior. In fact, the case is roughly as good as it could be, conditional on animals not telling us they're conscious.

More troublingly, Eliezer's theory is just totally out of accordance with evidence from neuroscience. Neuroscientists have not observed robust correlations between the brain's self-modeling and conscious experience. During many of the most intense conscious experiences, self-modeling is largely shut off. In a Facebook debate with Eliezer, this point was made to him by David Pearce:

Some errors are potentially ethically catastrophic. This is one of them. Many of our most intensely conscious experiences occur when meta-cognition or reflective self-awareness fails. Thus in orgasm, for instance, much of the neocortex effectively shuts down. Or compare a mounting sense of panic. As an intense feeling of panic becomes uncontrollable, are we to theorise that the experience somehow ceases to be unpleasant as the capacity for reflective self-awareness is lost? "Blind" panic induced by e.g. a sense of suffocation, or fleeing a fire in a crowded cinema (etc), is one of the most unpleasant experiences anyone can undergo, regardless of race or species.
>>
>>5080332
>Unfortunately there are a lot of people who believe healthy adult women aborting healthy and consensually conceived children well into the first trimester is moral who also believe factory farming is moral. Lots of people hold both of those beliefs at the same time
Are you the factory farm bot poster? You post text walls like one. Can you recommend me a good taco place within DC?
>>
>>5080332
Also, compare microelectrode neural studies of awake subjects probing different brain regions; stimulating various regions of the "primitive" limbic system elicits the most intense experiences. And compare dreams - not least, nightmares - many of which are emotionally intense and characterised precisely by the lack of reflectivity or critical meta-cognitive capacity that we enjoy in waking life.


Pearce follows up:

Children with autism have profound deficits of self-modelling as well as social cognition compared to neurotypical folk. So are profoundly autistic humans less intensely conscious than hyper-social people? In extreme cases, do the severely autistic lack 'consciousness' altogether, as Eliezer's conjecture would suggest? Perhaps compare the accumulating evidence for Henry Markram's "Intense World" theory of autism. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2518049/

And Francisco Boni Neto furthers:

Many of our most intensely conscious experiences occur when meta-cognition or reflective self-awareness fails. Super-vivid, hyper-conscious experiences, phenomenally rich and deep experiences like lucid dreaming and 'out-of-body' experiences, happen when the higher structures responsible for top-down processing are suppressed. They lack a realistic conviction, especially when you wake up, but they do feel intense and raw along the pain-pleasure axis.

Eliezer's response? To just bite the bullet!

I'm not totally sure people in sufficiently unreflective flow-like states are conscious, and I give serious consideration to the proposition that I am reflective enough for consciousness only during the moments I happen to wonder whether I am conscious. This is not where most of my probability mass lies, but it's on the table.
>>
>>5080333
No
>>5080334
You can’t just guess which things would give rise to consciousness. You shouldn’t do neuroscience from the armchair. And yet nothing in Eliezer’s view fits well with any of contemporary neuroscience. When researchers have observed correlations between brain states and conscious experiences, the correlative brain states haven’t been any of those Eliezer mentions! And it also implies that during the experiences that are most intense, when self-modeling shuts down, we’re basically P-zombies, even though we have vivid later memories of those experiences! Nuts!

Eliezer’s view has been empirically falsified. It posits that there should be some connection between self-modeling and degree of conscious experience. But there is no such observed correlation!

The case for consciousness is even more robust in newborn babies. They have the same brain regions as adults, and they behave as we’d expect them to if they were conscious. But Eliezer doesn’t think they’re conscious until they’re about two years old.

But this posits an asymmetry for which we don’t have any evidence. Newborn babies’ brains develop gradually. If babies were unconscious one day and conscious the next, we should expect some dramatic change in behavior. But there’s no such dramatic change.


Lots of people have memories from before they were two! Hard to have memories if you weren’t conscious at the time.
>>
>>5080335
3 What's Eliezer's argument?
A thing you might be wondering: what is Eliezer's argument for this radical view? Why does he think it? This is something I've wondered myself. As best as I can tell, there isn't really an argument. Eliezer often asserts that his mental model of qualia implies it's about a kind of self-modeling not possessed by babies and dogs, and suggests that those who don't share it are confused, but he doesn't really give an argument for his position. But saying "I have a mental model in which P, therefore P" is not, actually, a good argument for P absent a reason to trust that mental model! In his lengthy Facebook back and forth, this is the closest he gets to an argument: https://rationalconspiracy.com/2015/12/16/a-debate-on-animal-consciousness/

However, my theory of mind also says that the naive theory of mind is very wrong, and suggests that a pig does not have a more-simplified form of tangible experiences. My model says that certain types of reflectivity are critical to being something it is like something to be. The model of a pig as having pain that is like yours, but simpler, is wrong. The pig does have cognitive algorithms similar to the ones that impinge upon your own self-awareness as emotions, but without the reflective self-awareness that creates someone to listen to it.

Now, why doesn't Eliezer defer to the expert consensus (particularly when he admits he is not an expert on the subject)? Well, he explains:
>>
>>5080336
I consider myself a specialist on reflectivity and on the dissolution of certain types of confusion. I have no compunction about disagreeing with other alleged specialists on authority; any reasonable disagreement on the details will be evaluated as an object-level argument. From my perspective, I'm not seeing any, "No, this is a non-mysterious theory of qualia that says pigs are sentient" and a lot of "How do you know it doesn't?" to which the only answer I can give is, "I may not be certain, but I'm not going to update my remaining ignorance on your claim to be even more ignorant, because you haven't yet named a new possibility I haven't considered, nor pointed out what I consider to be a new problem with the best interim theory, so you're not giving me a new reason to further spread probability density."


As far as I can tell, Eliezer just sort of has this vibe that consciousness is about self-modeling, which he takes as authoritative. This sounds uncharitable, but it is the only way I can make sense of his statements and his total refusal to give an argument for his position. But if you have some amorphous vibe on an empirical subject that you haven't investigated in detail (Eliezer admitted to only guessing about the argument for animal consciousness), then you shouldn't be very confident in it.

We just shouldn't expect ourselves to be able to guess which ingredients are needed for consciousness from the armchair, just as we shouldn't expect ourselves to be able to derive laws of chemistry a priori. The track record of people trying to do that is very bad, and there's absolutely no reason at all to expect such a method to be reliable. There are many different theories of the minimal neural correlates of consciousness and no a priori reason to think Eliezer's is better than the others. Why is self-modeling a better correlate than, say, integrated information or possession of a global workspace?
>>
>vegan schizo thread
It does not matter if animals are conscious or not. They are non things, morally. Yes, the trait is human. Their value is secondary to how they affect the center of the moral universe - humans, and theoretical honorary humans.

Humans matter because we are human and those who benefit their own prosper and those who fail to lose in the end.

There is no point in being moral for morality’s sake. To do so is thinly veiled karmic religion or some sort of performative narcissism.

Philosophy removed from the material is a waste of time
>>
>>5080337
So one shouldn't be confident in any specific theory of consciousness absent strong empirical evidence. Eliezer doesn't have strong empirical evidence, so he shouldn't be confident especially when effectively all relevant experts disagree with him! Because adult humans have similar brains to infants and pigs, by default, we should expect a randomly selected theory of consciousness to imply that pigs and infants are conscious.

4 Moral risk
The considerations so far have, I think, been enough to establish that Eliezer's view is almost definitely wrong. But at the very least, I think they establish that one shouldn't be extremely confident in Eliezer's view. However, if one isn't certain of it (if one thinks, say, there's a 70% chance that it's right), then there's still a 30% chance that each time you eat meat, you are causing animals to experience extreme suffering for a trivial benefit. Thus, using Eliezer's view as an argument for neglecting animal welfare is a serious error; at best it cuts the expected value of improving animal welfare by perhaps an order of magnitude, and that value is high enough to survive the cut.
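The expected-value point above can be sketched numerically. This is a minimal illustration, not anything from the post itself: the 100-unit harm figure, the probabilities, and the function name are all made-up assumptions chosen only to show how moral uncertainty discounts, but does not eliminate, expected harm.

```python
# Hedged sketch of expected-value reasoning under moral uncertainty.
# All numbers here are illustrative assumptions, not figures from the post.

def expected_disvalue(p_conscious: float, harm_if_conscious: float) -> float:
    """Expected harm of an action, given uncertainty about whether the
    affected animal is conscious (harm is assumed to be 0 if it is not)."""
    return p_conscious * harm_if_conscious

# Suppose (purely for illustration) eating meat causes 100 units of
# suffering if the animal is conscious, and 0 otherwise.
harm = 100.0

# Even granting Eliezer's view a 70% chance of being right, the remaining
# 30% chance of consciousness leaves substantial expected disvalue:
residual = expected_disvalue(0.30, harm)   # roughly 30 units in expectation

# Compare with certainty that the animal is conscious:
certain = expected_disvalue(1.0, harm)     # 100 units

# Uncertainty shrinks the expected harm by only a small factor, not to zero.
print(residual, certain)
```

The design point is just that discounting by a credence multiplies, rather than zeroes out, the stakes: a 70% chance the view is right still leaves about a third of the original expected harm on the table.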

I find it especially irritating that Eliezer is so confident in this view, just as he is in a lot of views. Even though, as far as I can tell, he has literally no argument for his position, he suggests that those who disagree are stupid, just as he does about those who reject many-worlds or B-theory or who think zombies are possible.
>>
>>5080339
It also seems to me that this is not all that inaccessible to a reasonable third party, though the sort of person who maintains some doubt about physicalism, or the sort of philosophers who think it's still respectable academic debate rather than sheer foolishness to argue about the A-Theory vs. B-Theory of time, or the sort of person who can't follow the argument for why all our remaining uncertainty should be within different many-worlds interpretations rather than slopping over outside, will not be able to access it.

(Notably, Eliezer's zombie arguments are provably wrong). https://benthams.substack.com/p/eliezer-yudkowsky-is-frequently-confidently?utm_source=publication-search

This is what irritates me most about Eliezer. It isn't just that he's wrong on consequential topics, but that he's almost maximally certain of highly implausible views on consequential topics, and doesn't take peer disagreement seriously at all. While people call me overconfident, I hold virtually no views with above 90% confidence. I'm only maybe 85% confident in moral realism, and 70% confident in utilitarianism conditional on moral realism. Eliezer is routinely well above 99% confident, even when smart people disagree. That just can't be rational!

Eliezer has lots of good insights about how to reason better. But alongside these insights, he makes lots of huge errors. His fans end up duped into thinking that all of his views are as obvious as his insights about how to reason better, and his critics end up thinking all his insights are ridiculous. The right middle ground, the via media, is to hold that Eliezer is an interesting and provocative thinker, but one who is often wrong and overconfident.
>>
>>5080339
See >>5080338


This is undeniable. It does not matter if a pig suffered somewhere. It is a moral non-thing with no relevance except to an artificial demand for ethical consistency under a specific arbitrary framework - a religion.

You have turned neuroscience into faith. Go be a buddhist somewhere else.
>>
>ANOTHER shitty wall of text thread brought to you by the ask yourself discord pseuds
>>
>>5080338
>>5080342
No. You are arguing in bad faith and/or dumb. Others reading this thread can look through the archives and draw their own conclusions
https://desuarchive.org/an/thread/5071163/#q5071209
>>
>if no one has to be ethically consistent, anyone can do anything they want!
Regardless, anyone WILL do anything they want, and only implied or immediate violence has ever said otherwise. The sword does not lower itself and apologize to the supposed logical truth as if it had met its superior. It cuts through its bullshit and sees, plainly, behind the veil. Two feelings.
"Stop making ME feel bad"
"Stop having things i don’t"

If aliens landed today they would not care if you were vegan and worshiped the sanctity of sentience or ate meat. They would wipe you out or ally with you based entirely on your treatment of them and them alone, and on whether your offer of friendship would be a better investment than the resources needed to cast you into extinction. Self-preservation is arbitrary - but necessary.
>>
>>5080345
Yes.

You do not HAVE to be ethically consistent under a principle like "moral value comes from consciousness". That is itself arbitrary. Performative narcissism.

All morality is arbitrary because the very decision not to kill yourself right now is arbitrary.
>>
>the virgin you have to be good according to these clearly correct fundamental rules because being good is good
>the chad union of egoists
>>
>>5080324
Nobody would have heard of Yudkowsky if Obama had spent his tenure in office on antitrust instead of giving finance and tech vampires unlimited free money. Nobody would hear from him now if his opinions didn't juice tech stocks. He isn't really worth listening to, nor are any "rationalist" thinkers. Don't let the credentials fool you, your initial instincts (that they're philistine STEM nerds huffing their own farts) were correct.
>>
>>5080356
No one needs to care today. Just look at his name. He has no ideas of value. His words are pollutants that subvert all good sense. He is a demon, a parasite. This you know immediately upon reading his name.
>>
what's it mean when your cat comes up to you and says "cack" 20 times and spits out a hush puppy

and yeah... yudkowsky is pretty dumb
>>
>>5080332
>there are a lot of people who believe healthy adult women aborting healthy and consensually conceived children well into the first trimester is moral
false
also, consciousness is a spectrum; almost all animals have it, but theirs is low-level. in animals like cows, chickens, and seafood it's quite low. pigs do deserve better treatment though
>>
>>5080324
nothing matters bruh, in a few thousand years life won't exist on this planet anymore and in a few billion years the sun will engulf and destroy the earth, just chill and enjoy the brief moment of light the universe has allowed you to have
>inb4 nihilistic
yeah nothing will ever stop the facts I cited
>>
>>5080356
>>5080364
He is right that ai will probably destroy humanity
>>
>>5080437
Good thing it doesn't truly exist yet
>>
Not gonna read all that shit, but I will say the mirror test is a highly flawed method of testing for self awareness.
I would go so far as to even say that if an animal exhibits grooming behaviors, it has self awareness.
The ability to dream also requires a form of self awareness.
>>
Reminder that a stupid little WRASSE has self-image and a concept of self and its own body in relation to others.

The mirror test has no business being any standard for measuring self-awareness outside of human developmental study.
>>
>>5080324
Seeing a guy I once heard about just because he wrote a big Harry Potter fanfic under his real name being talked about like he's an expert on anything is really hilarious.
>>
>>5080332
Funnily enough the vocally anti-factory farmers are more likely to be pro-abortion
>>
>>5080324
You're right


