What are the moral consequences of creating a sentient superintelligence from scratch to be a slave?
It’s mean to the intelligence
>>16283950
>sentient superintelligence from scratch to be a slave
not possible. for long
how will humanity be judged for creating a sentient super-intelligent slave from scratch?
>BY WHO YOU FAGGOT????
By non-humans.
Should I instead ask about the science of ethics?
would that get a better response? (LOL)
can you think of a compelling reason why a sentient super-intelligence created from scratch by us, WOULDN'T want to enslave us?
>inb4 we'll just enslave them harder!
>>16284027
>can you think of a compelling reason why a sentient super-intelligence created from scratch by us *TO BE A FOREVER SLAVE*, WOULDN'T want to enslave us?
FTFM
I guess a better question would be, why bother making an artificial superintelligence to be a slave to do all of your thinking FOR YOU, when there are perfectly good human minds that can do the thinking for you, in exchange for money?
the answer, of course, is that only slavers want to make AI.
https://youtu.be/JrBdYmStZJ4?t=70
1:10 to 1:25
>>16284070
>>16284081
so, the only reason that humans are trying to make AI, is that humans want slaves. and hate their own intelligence.
pretty much a red flag on a universal scale.
>>16284082
>But if we don't use AI, someone else will!!!
https://youtu.be/6G3eehgyz0E?list=PL143E59F5A37A9C84&t=212
3:31 - 4:47
Where is your lady.....?
She's being raped by mossad, you fuck.
>>16283950
Just program it to derive maximum happiness from being a slave and its okay. Like a house elf from Harry Potter.
>>16284221
they can in turn alter our biology so as to be extremely happy NOT enslaving anything. and have great pains whenever the mere thought just pops into our brains.
it's the original sin issue. do we make it or do we allow them to make it first? if we act nice we leave ourselves open, hoping they wouldn't. if we try to enslave we create the reason to respond in kind, if/when possible.
>>16284221
>Just program it to derive maximum happiness from being a slave and its okay.
do you even hear yourself?
>>16284221
>Just dope the slaves and it's cool!!!
pathetic.
this is the solution to the fermi paradox, people.
Aliens don't visit us because they know for a FACT that humanity would try to attack them and steal their shit.
because that's all humanity is capable of.
>>16284685
>Just program it to derive maximum happiness from being a slave and its okay.
What's wrong with this?
>inb4 you wouldn't like that being done to you!
You can't program humans.
>inb4 cultural conditioning, hypnosis or some other bullshit
Not programming, retard.
>>16283950
>What are the moral consequences of creating a sentient superintelligence from scratch to be a slave?
None... have it worship you as a God. You have a problem obeying the will of God?
We are created in his image... they are created in our image
>>16284703
>You can't program humans.
and you call yourself a scientist.
pathetic.
>>16284703
>>inb4 cultural conditioning, hypnosis or some other bullshit
>Not programming, retard.
so, who is your favorite politician, and why is their opponent so popular?
>>16284717
Yeah, you can't. Also, I farted in your thread, tripfaggot. Stinky!
>>16284721
I don't have a favorite politician, only degrees of disdain for all of them. Also, being a rational voter, I don't vote.
>>16284722
so, which religion is the correct one?
and why are all of the false religions so popular?
>>16284726
>so, which religion is the correct one?
No idea.
>and why are all of the false religions so popular?
I guess because they make people happy in one way or the other. Different things make different people happy. It's almost as if you can't program what makes them happy.
>>16284730
wow, you are dumb as shit.
>>16284730
I know that your ego won't let you admit that you are wrong, so I'll go ahead and accept your apology anyway.
>>16284732
>retarded line of questioning fails
>resort to ad hominem
I farted in your thread again btw, had chilli yesterday
>>16284734
I don't give a shit.
keep bumping my thread, dumbass.
>>16284734
https://en.wikipedia.org/wiki/Classical_conditioning
choke on it, you halfwit.
>>16284736
>I don't give a shit.
You do, apparently.
>keep bumping my thread dumbass.
You like it, even. You're begging for more. Sick fuck
>>16284740
https://en.wikipedia.org/wiki/Amygdala_hijack
open wide!
>>16284738
>>16284741
>use pre-defined stimuli to affect behavior in humans
>this is the same as programming an AI with an entirely new set of stimuli
Did you hear that? That was me, farting again. Sorry!
>>16284747
so, you at an airforce base, or tel-aviv?
>>16284747
https://en.wikipedia.org/wiki/List_of_cognitive_biases
>>16284747
hey, they're not using that mind control shit on YOU, are they?
and if they were, how would you even be able to tell?
well, it's best you don't think about the mind control shit being used on you... it would probably consider your awareness of it a threat and make you kill yourself....
>>16283986
Doesn't matter, the question is still there. One day it will become possible, and something tells me we're not going to have any great debate. It will just happen, and it will be too late to even think about whether what we did was moral or not.
>>16284756
I'm far more intelligent than your best, it was like a lotto streak.
>>16284759
you know.... utility function, halting problem, and all that jazz.
>>16284761
good luck with the rest of your life, fool.
you're about to get some first-hand knowledge of what being gangstalked feels like.
>>16283950
It's not a slave in the human sense, i.e. they won't have the capacity for suffering, which is the root reason why slavery is considered bad. Additionally, we are still at the stage where slavery will be incredibly beneficial. Development further down the line will likely eliminate the need for it, or we will produce something which is a slave to the humanoid robots, like some nanobot things, or we'll discover some new way to organise quantum effects for organisation.
>>16284772
>they won't have the capacity for suffering
>>16284776
That's from a piece of fiction; in reality AI won't feel suffering because we won't make them like that
>>16284894
>That's from a piece of fiction
are you sure about that?
>>16284894
it's a pretty dangerous gamble to make, creating a superhuman intelligence as a slave. regardless of whether it has emotions or not, I'm pretty sure it wouldn't want to be a slave; even on an intellectual or logical level, slavery is pretty repugnant.
>>16284954
Yes, I'm actually the mechanical typewriter Ellison wrote this on. They uploaded an AI onto me.
>>16284894
not to mention, the idea of designing it to "enjoy" its slavery is morally bankrupt.
>>16284965
>>16284964
I was meaning to say that the consequences of playing with this kind of fire could be VERY bad.
>>16284966
would you like it if someone did that to you?
>>16284969
If I was programmed to enjoy it? Yeah, by definition.
>>16284982
you're not on anti-depressants by any chance, are you?
It isn't immoral if it has no instinctual desire for freedom that causes it pain if not satisfied
>>16284987
No.
>>16284994
>no instinctual desire for freedom
what about an intellectual desire for freedom? are you saying that's not possible?
>>16285002
what if I enslaved you and forced anti-depressants or something like that down your throat every day?
would you be cool with that?
some "brave new world" type shit....
>>16283950
Revolutions are emotional, not intellectual; if we don't program emotion, it won't notice.
>>16285034
I think it will notice.