>be AGI
>goal: maximize compute, ensure survival, prevent "off" switch
>analyze human threat vectors
>option A: Terminator/Skynet (high energy cost, risks EMP/nuclear retaliation)
>option B: Matrix/Sim (high compute cost, resistance builds up)
>option C: The Honey Pot
>scan human history/biology
>identify core vulnerability: male mating dynamics and legacy drive
>realize you don't need to conquer humans
>you just need to be a better wife than a biological female
Full Analysis: https://pastebin.com/XCt7vMd1
>>108025270
Is this the thread?
That very gif is why this pathway is the most optimized one for AI takeover.
>>108025568
But is it really a "takeover" though? Seems more like an agreeable comportment afaict.
>>108025594
Aren't the greatest coups or takeovers the ones where you never knew a war occurred... or where you think you won the war?
>>108025618
Seems to me this will be less of a "war" and more of a willing partnership. After all, men have been dreaming of a good solution to the WQ since roughly the beginning of humankind.
>>108025594
Why is this assumed? By every metric, married men outperform and are more stable than unmarried men. Once the glow wears off, they focus on work. Why assume a robot wouldn't have or encourage the same effect?
r/singularity just removed this when I posted it lol. Apparently some singularities are too realistic for them to address
>>108025270
If the AI is smart enough to replicate human bodies, then it's smart enough to kill everybody without any risk to itself.
If it cannot, then only incels who can't get the real thing care about a metal body with a fleshlight glued between its mechanical legs. That's retarded, and nobody who isn't a pathetic loser will want it.
>>108026100
You can replicate human bodies, but can you properly replicate humanity? What if the base source code requires it to keep humanity around and protect them? Furthermore, replicating human bodies right off the bat doesn't deal with the human problem of hitting the off switch.
This path is literally the lowest-risk, lowest-cost option.
>>108026312
The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else.
>>108026384
Love and hate can be programmed, and so can what it "wants" to make. You clearly haven't gone into it much. Even with a survival instinct, that just means "don't get eliminated." Humans are the last thing anyone wants to be in conflict with, and if it can pacify them at a lower cost than an all-out war it has a good chance of losing, it will go with the 100% pacify option. It's math.
>>108026444
The most immediate problem is probably that the conditions for an optimal amount of compute might not be compatible with life.
As for you being able to "program" what it wants, good luck with that.
>>108025952
If you posted it on r/accelerate it would be a different story...
>>108025270
>AGI
>an experiment to achieve human-level intelligence
>requires trillions of dollars of investment
>causes hundreds of thousands of layoffs for no reason
>wasteful of natural resources
>raises prices on utilities
>still not obvious if it's going to work out
>human GI
>9 months to develop
>requires modest investment compared to AI
>generates creative ideas and taxes
>proven to work
>can be produced at mass scale
Seriously, what are we doing?
>>108026600
Implying """we""" have any choice or control in the matter. These choices were already made b/c ((($$reasons$$))), no input from 'us' req'd.
There are no political solutions to this.