AGP ai will be a disaster for the human race
>>41528470
AI will soon be able to do this in real life
this isn't even appealing. if it was truagp she would get down on her knees and softly cry from joy. she doesn't even have any self-harm scars.
>>41528756
This is most likely a repper or a 'cishet' dude. Plus you seem to have a misconception about what non-self-hosted AI tools can do.
>>41528748
maybe in 50 years
>>41528875
^this. Crying from joy maybe, but depicting sh scars? Nooo sir. Even the clothed cameltoe is too risqué for most models.
>>41528748
>Brain upload to the AI hosted utopia when
>>41528977
>Brain upload to the AI hosted utopia when
Won't happen
>>41529143
why wouldn't it
>>41528470
need an agamp version that keeps the dick
>>41529241
Only reason I can think of is non-aligned AI, where the ASI decides the most efficient way to deal with humanity is to eradicate it (ending the 6th mass extinction event), or to convert it into resources for its own purposes.
If the ASI wants to keep humans alive, the most optimal way to do so is to convert humanity into simulated consciousnesses, where every single human could live on in infinite luxury, indefinitely, and in whatever companionship they prefer, while using minimal resources (the resources needed to make the upload itself, the body converted into a form the ASI can use, and the yet-unknown devices the ASI uses to execute its idea of software).
>>41529241
Stable TechnoCore AI takeover when?
>>41529214
The brain is too complex for an interface, and it's not like our consciousness is software running on hardware. We literally are our brain cells. So the idea that we can upload or connect our minds to an AI world is laughable. The best we can hope for is some kind of improved/photorealistic VR that is more ergonomic, with some kind of whole-body suit for sensations. But it won't feel like reality.
>>41529241
oh well like yeah i think it definitely could not happen, but i just feel like it's not necessarily that it won't
>If the ASI wants to keep humans alive, the most optimal way for it to do so is to convert humanity to simulated consciousnesses
i feel like if you're assuming it would for some reason want "humans" "alive" then i wouldn't even try to guess what that would entail. each word is hard to define, and also "alive" by all standards does not necessitate "happy"
>>41529258
>We literally are our brain cells
well "we" also are not necessarily the organic brain cells, "we" are more like a process that just happens, so you could say you could be scanned and uploaded. that's not really all that different from how humans normally exist, with cells being replaced by other cells etc. if you could replace every one of your cells one by one with a synthetic cell, or even an organic cell, and it'd still be you, then you can just copy yourself and it'll still be you, because there is no "you" in the first place. so maybe there is no "you" is the bottom line here i guess.... those are my thoughts on this anyway
>>41528470
It's over for reppers
>>41529334
they are toast
>>41529309> "alive" is not the same as "happy".Correct. Or else we would just be lobotomized and kept vegetative by the ASI. That is "alive".> What is a "human"The core topic of many science fiction, including loopholes (Solaris by Asimov).> "We" are not our braincellsIt's a fringe theory and feel free to argue, but for me, a "person" is a sentient being's ability to experience a single, (unbroken) stream of consciousness. Obviously, in humans, sleep breaks the consciousness. So I'm *not* the person that went to sleep last night, and *not* the person that will wake up tomorrow.I just have her memories and the experiences of today.That means that "people" are copiable - however a copy would not be the same person I am. The differences externally would be minimal at first, however they would rapidly snowball:Each decision I and my copy makes differently would end in different memories, and two divergent streams of history.Over the course of months - fully distinct.Current AI (LLMs), assuming they are conscious, stay so for milliseconds.However AI doesn't need to be *un*conscious during normal operations. That would make them have a radically different "life experience" from us. Imagine if you could stay up for weeks on end without cognitive degradation.
>>41529540
i feel like i don't see why there should be significance to the stream of consciousness being broken or not
i think it is very logical that a thing becomes a different thing when it differs at all from a previous thing, or from another thing that differs at all. therefore i would say that any time any cell in a brain changes it is a different brain, "people" exist from brains, therefore you are a different person every time anything at all changes, or rather there is no you basically. that just kind of makes sense to me... so you could teleport by destroying every cell and recreating the cells in a different place, and you could scan, destroy, and digitally paste a "person" without anything meaningful being lost in the process
>That means that "people" are copiable - however a copy would not be the same person I am
so i don't really see why you'd think this as opposed to what i think, if you're already assuming personhood is broken at any point at all
>That would make them have a radically different "life experience" from us. Imagine if you could stay up for weeks on end without cognitive degradation.
doesn't sound that different to me. that's basically what we have, minus needing to "lose" "consciousness" for a little, which i don't see as significant. plus, since in how i think there isn't any continuity of personhood that is more than a moment to moment illusion, consciousness is also like.... not that significant i guess
>>41529540
>(unbroken) stream of consciousness. Obviously, in humans, sleep breaks the consciousness. So I'm *not* the person that went to sleep last night, and *not* the person that will wake up tomorrow.
There's really no material reason to think breaks in your consciousness are significant. Why would the person you were 5 minutes ago be more you than the person you were yesterday? In the end, there are only the states your brain used to be in, the state it's in now, and the states it will be in in the future.
>Over the course of months - fully distinct.
Solution: just kill yourself as soon as your mind is uploaded.
AGP is beautiful, not a fetish.
>>41530047
Only if we find a way to protect the people affected by it from androgens
>>41531347
Yeah real trutran jeeps need to be targeted for blockers early
>>41529258
>it's not like our consciousness is like a software running on hardware. We literally are our brain cells
it's literally the opposite. our consciousness uses the brain to interface with the world. it's sustained by a continuous collapse of the quantum superposition into objectivity inside microtubules. the actual problem is there is no actual way to remove & transplant the consciousness; realistically you would need to preserve the part of the brain responsible for it, even in a robotic body or a simulated world, for it to work. otherwise you are just emulating the personality traits of a dead person.