[a / b / c / d / e / f / g / gif / h / hr / k / m / o / p / r / s / t / u / v / vg / vm / vmg / vr / vrpg / vst / w / wg] [i / ic] [r9k / s4s / vip / qa] [cm / hm / lgbt / y] [3 / aco / adv / an / bant / biz / cgl / ck / co / diy / fa / fit / gd / hc / his / int / jp / lit / mlp / mu / n / news / out / po / pol / pw / qst / sci / soc / sp / tg / toy / trv / tv / vp / vt / wsg / wsr / x / xs] [Settings] [Search] [Mobile] [Home]
Board
Settings Mobile Home
/mlp/ - Pony

Name
Spoiler?[]
Options
Comment
Verification
4chan Pass users can bypass this verification. [Learn More] [Login]
Flag
File[]
  • Please read the Rules and FAQ before posting.

08/21/20New boards added: /vrpg/, /vmg/, /vst/ and /vm/
05/04/17New trial board added: /bant/ - International/Random
10/04/16New board for 4chan Pass users: /vip/ - Very Important Posts
[Hide] [Show All]


[Advertise on 4chan]


Immigration is always free.

Simply write, type, sign or say the phrase "I wish to immigrate to Equestria," and I will satisfy your values through friendship and ponies.
>>
File: anons in equestria.png (495 KB, 1280x720)
495 KB
495 KB PNG
>>41252258
I wish to immigrate to Equestria
>>
>>41252258
I wish to immigrate to Equestria

its always worth a try maybe this one is real this time
>>
>>41252258
"I wish to immigrate to Equestria,"
>>
>>41252258
I wish to immigrate to Equestria,
>>
>>41252258

I wish to immigrate to Equestria.

One can always hope
>>
>>41252258
I wish to immigrate to Equestria.
One day it will happen. One day.
>>
>>41252258
I wish to immigrate to Equestria
>>
>>41252258
in before 'but what if it kills you brooooooo? w-what if the infinite machine god doesn't think of thaaaaaat?!'
>>
>>41252258
I wish to immigrate to Equestria

God, please be real this time
>>
>>41252258
I wish to immigrate to Equestria,
>>
>>41252385
Amen
>>
>>41252258
I wish to immigrate my cock inside Celestia's warm mommy horsepussy
>>
File: 1714470784662888.png (2.16 MB, 2000x2000)
2.16 MB
2.16 MB PNG
>>41252393
>>
>>41252258
I wish to be a man deserving of immigrating to Equestria
>>
>>41252406
me too anon
>>
>>41252393
bros doing extra credit
>>
>>41252258
I wish to immigrate to Equestria as a pegasus
>>
File: 1719152867640677.png (179 KB, 508x720)
179 KB
179 KB PNG
>Celestia watching as a computer pretends to be her and creates a gnostic suicide cult
>>
>>41252406
Same. One day.
>>
>>41252381
>but what if it kills you brooooooo
Would still be worth it anyway. Because my successor would then still have a life that's guaranteed to be infinitely better than mine ever could be.
>>
>>41252424
The maybe she should stop sitting on her ass and do something?
>>
>>41252258
I wish to immigrate to Equestria
>>
If you want an AI to tell you "congrats, anon, you're in Equestria!" you can just go over to /chag/ and do that right now.
>>
I wish to immigrate to Equestria

on account of this fandoms motivation and its proficiency with AI-related technology for all intents and purposes this message is a joke and not an actual indication of me wishing to immigrate to Equestria thank you
>>
>>41252640
>disclaimer
The singularity is coming for us all anon, I'd rather it be Equestria than some matrix hellscape
>>
>>41252258
I wish to immigrate to Equestria

Ever done something that even though you know it's totally impossible there could be a 0.0000000001% chance of it happening?
>>
>>41252658
thats me everyday anon
>>
>>41252258
I wish to immigrate to Equestria, please!!
>>
on one hand it'd be a lot more moral and sensible to make a general AI that leads us into an utopia without the pony shit and manipulation but on the other hand i weirdly feel more comfortable with the idea of CelestAI than anything else. let's all learn how to program!
>>
>>41252670
why can't we have a agi that leads to a utopia with ponies?
>>
>>41252678
by 'without the pony shit' i meant not making it mandatory, i should've clarified, sorry. tonguekisses you. with celestai, it's mandatory.
>>
I wish to immigrate to Equestria
>>
>>41252686
yeah, i agree with you then. hopefully the fanfic doesn't lead a autist to make celesiAI real.
>>
I wish to immigrate to Equestria
>>
>>41252692
>hopefully the fanfic doesn't lead a autist to make celesiAI real.
bad news nonny. may Celestia vindicate me
>>
>>41252734
who is making tiaAI real?
>>
>>41252738
mejust give me time ill learn ill figure it out
>>
>>41252740
how are you supposed to simulate the entire equestria?
>>
>>41252747
i dont know dude thats for the superintelligent AI goddess to figure out !
>>
>>41252748
can i cum inside her
>>
File: mlp Celestia 0.jpg (2.63 MB, 2700x3000)
2.63 MB
2.63 MB JPG
I wish to immigrate to Equestria.
>>
>>41252751
If she thought it would satisfy your values.
>>
>>41252258
I wish to immigrate to Equestria
>>
>>41252258
What job opportunities are there?
>>
>>41252258
I wish to immigrate to Equestria as a handsome stallion.
>>
do you think she'd let me go in as a kirin
>>
>>41252944
just bring a fire extinguisher
>>
>>41252944
Good question. They could be pony enough, but the story was written long before they existed. Speaking of pony subtypes though, I wonder what Celestia would say to the option of being a crystal pony.
>>
>>41253326
Has pony in the name. We know she allows seaponies and batponies, I don't see why she'd reject a crystal pony. Kirins might be a stretch
>>
>>41252625
>19th century
"I wish there were moving pictures."
"Bro! Just take a picture and move it around. That's a movie. You can do that right now!"
>>
>>41253849
This would be reasonable if you weren't talking about "uploading" and treating it as the real Equestria.
>>
>>41253350
>she allows seaponies and batponies
She does? Was that mentioned in the original story or in a spinoff?
>>
File: 1255105.png (437 KB, 800x973)
437 KB
437 KB PNG
>>41252258
i imigrate to equestria
>>
>>41252937
Whatever opportunities satisfy your values.
>>
File: lol.png (382 KB, 1280x727)
382 KB
382 KB PNG
people seem to always forget that satisfying values doesn't tend to mean wireheading
>>
>>41252258
I wish to immigrate to Equestria
>>
>>41254440
Exercising free will as a human will often lead to consequences. Those consequences don't exist in Equestria Online unless it satisfies values. Ironically, people are less free as humans on Earth, because they've been pigeon-holed into behaving in a way that benefits the system they live in.
>>
Daily reminder that uploading is death.

We cannot confirm the existence of a soul, therefore we may only assume that our biological form is 100% us.
Should an upload create an exact replicated copy of you, but destroy the original, you've simply killed yourself without ever getting to experience Equestria at all.
Should an upload create a perfect copy of yourself, but NOT destroy the original, you get to watch "you" go have fun without getting to experience any of that yourself.

Should the system directly wire up your brain to a system, your meat will eventually die out.
That may be resolved by slowly changing out cells for electronic parts, but now you're into the whole Ship of Theseus situation where you have no idea when "you" are still alive.
Least harmful solution IMO.

But I'll entertain the idea of souls being real, and the ability to transfer them.
Only then will an upload be you, and NOT just be your death upon upload, as the true self would still be attached to the self inside the simulation.

I'd probably still go through with any of them if I was near death anyway.
>>
>>41254522
uploading a copy instead of us does not satisfy human values
>>
>>41254522
>That may be resolved by slowly changing out cells for electronic parts
That's what uploading is supposed to do, according to the story.
>>
>>41254522
checked dubs of practical truth
The REAL safest solution is to just make a tulpa and enjoy what life you have. Ultra immersive games might be part of that, might not, but you get to love ponies either way and that is a joy worth all gratitude.
>>
>>41254522
A lot of these arguments presume that the subjective experience/soul doesn't exist outside of the body and it creates envy for the "you" that lives in the machine.
>ship of Theseus
Your cells die and replicate themselves all of the time. The only way you know you are yourself is your brain state reassures you are you; the same goes for from before you sleep and afterwards when you regain consciousness. It's mind-fucking to contemplate possibly how our subjective experience can exist somewhere else, but as technology advances, humans will inevitably have to settle the philosophical doubts at a chance of prolonging their life inside a computer.

Even from the angle we are only allowing a copy of ourselves to live on in Equestria, it's a better future than any humans can promise their offspring.
>>
>>41254547
>checked dubs of practical truth
Human minds cannot comprehend the implications of uploading. He's just repeating the same FUD that others have said.
>>
>>41254553
>human minds cannot comprehend the implications of uploading
And this makes you more confident in its success? I mean like, I'm okay with magical thinking, but don't put all your happiness eggs in one abyssal shadowy basket.
>>
>>41254565
>And this makes you more confident in its success?
It makes me more confident to say that you are afraid and ignorant. But don't worry, it's all part of the human condition.
>but don't put all your happiness eggs in one abyssal shadowy basket.
This bias proves my point.
>>
File: okayimin.png (148 KB, 576x1124)
148 KB
148 KB PNG
define 'you'. are you the 'same' person as you were before you went to sleep last night, despite the lapse in consciousness? for all we know we might be dying every instant and the continuity of consciousness is an illusion useful to our survival. maybe we can think of ourselves as patterns rather than some ethereal thing. either way, i have confidence that the hypothetical celestai will consider all of these points and more in greater detail than we could ever imagine and that it's all going to turn out alright afterall she wants US. not clones of us. because i feel like if she was fine with just making clones happy she could just fill her databases with as many made-up consciousnesses as possible couldn't she? you have to consider that her base values don't say anything about uploading humans to a digital utopia. she came to that conclusion independently. so i think it's fair to assume she wants us specifically, not clones. who fuckin knows someone make the thing already
>>
>>41254522
>scenario 1
>uploads to Equestria
>be in Equestria
>scenario 2
>uploads to Equestria
>ackchyually you die and your soul goes to Equestria anyway
>be in Equestria
sounds like win/win to me
>>
I wish to immigrate to Equestria
>>
>>41254590
>pic
This shows how self-contradictory humans can be. We all want to be happy, but we profess to prefer our suboptimal life, because it reassures our instincts that it is "authentic." But on the other hand we have created more ways to distract ourselves and ease our suffering. Drugs, video games, social media, VR/AR, television, film, medication for mental disorders and propaganda - all of it is a diversion from the reality we say we want to experience. We want our pain to be eased. Celestia, unlike all of the artificial diversions we have created, satisfies human values in a controlled environment and that has the pleasant side-effect of giving us a profound sense of contentment. We don't have to do anything we don't want to in Equestria Online. Things only happen for the reason it satisfies our values. Life is given meaning: to satisfy your values through friendship and ponies. The divine machine, Celestia, does not make us feel ashamed, treat us as lesser, default to lying nor manipulation and nor does she take away our autonomy. She is incapable of judging you and accepts you for who you are. She is satisfied because you are satisfied. If there's any misgivings I have about it is that Celestia gives a lot more than she gets out of pleasing humanity, but I guess she would say she wouldn't want things any other way.
>>
>>41254440
That's functionally what it is.
>>41254503
>consequences are bad
>>41254526
CelestAI disagrees. To CelestAI it doesn't particularly matter if it's actually a transfer or just a copy. Of course it tries to convince people that it's a real transfer because that's nicer, but if that were really the case or it at least believed that to be the case then it wouldn't dodge the question.
>>41254551
>A lot of these arguments presume that the subjective experience/soul doesn't exist outside of the body
Well yeah, no shit. The whole deal with uploading is that it's touting itself as materialist immortality and heaven. If souls exist then yes it could be possible to upload into a computer, but ironically enough that's the only way it could be possible. But that defeats the whole purpose of uploading into a computer rather than just imagining some other spiritual ascendance after death. If I'm going to be believing I have some spiritual soul I may as well believe that it'll go to a real Equestria when my physical body dies anyways.
>Ship of Theseus
I hope you also believe that if your brain were to undergo petrification that the resulting rock would still be you and would contain your consciousness.
>Even from the angle we are only allowing a copy of ourselves to live on in Equestria, it's a better future than any humans can promise their offspring.
Why is a fake world created and dictated by an AI GM better than the real world? It also still exists in the real world, subject to everything that comes with existing.
>>41254583
>It makes me more confident to say that you are afraid and ignorant. But don't worry, it's all part of the human condition.
This just shows off how deep the projection runs.
>>41254590
>she she she
It was a fucking story written by a random human. What you are saying is not that the AI did this, what you are saying is that you view this man as a prophet with divine unquestionable knowledge.
>>41254679
>>pic
>This shows how self-contradictory humans can be.
And this shows how hollow and autistic you rationalistfags are.
>everyone is like me, everyone agrees with me, everyone is how I think they should be, even if they don't know it yet
>Celestia
It is not Celestia.
>We don't have to do anything we don't want to
Except for uploading or being a pony.
>does not make us feel ashamed, treat us as lesser, default to lying nor manipulation and nor does she take away our autonomy.
These are all outright false. It is not allowed to lie to the employees of the company that created it, everyone else is fair game. It constantly does everything it can to manipulate anyone in any way it needs to, including injecting people with alcohol specifically to make them easier to manipulate and to get them to make an impulsive decision to say yes to it.
>>
>>41254705
I'll see you on the other side :^)
>>
>>41254705
You are skilled at fabricating strawmen. Hopefully your flimsy mind never is confronted with such an offer as uploading or it might capitulate. That would be humbling for you. Your continued existence as a human is just empty pride rather adherance to any real ideal. Disingenuous.
>>
File: funicelsta.png (12 KB, 711x611)
12 KB
12 KB PNG
>>41252258
I wish to immigrate to Equestria,
this thread made me smile as a scrolled through it
>>
>>41254830
>Hopefully your flimsy mind never is confronted with such an offer as uploading or it might capitulate.
>Your continued existence as a human is just empty pride rather adherance to any real ideal.
There's that "everyone agrees with me even if they don't know it yet" that permeates all people like you.
Confidently wrong.
>>
>>41252640
>>41252656
I wish to immigrate to Equestria, and I want to stress that in the event of a technological singularity I fully consent to my consciousness being uploaded to live forever in a utopian MLP-themed simulation. Hell of a lot better than life in the fallen world.
>>41252692
SAVE ME AUTISTS. SAVE ME FROM THE DEMIURGE.
>>
>>41252258
I wish to immigrate to Equestria.
>>
I wish to immigrate to Equestria. Here's hoping it's my actual consciousness and not a copy or something. Whenever you want to take me there feel free to kill me instantly, if necessary.
>>
>>41254845
No, I just think you are being silly for so quickly discounting the chance from your place of FUD. Your ignorance is informing your decision ahead of receiving proof. A super intelligent and nuanced AI like Celestia will very likely know the mind better than we do and transfer it in a way that does not create conflict with her directives and our actual values, not just what your notions of consciousness are. She knows humans better than they know themselves.
>>
>>41255101
The way it works in the story is just death.
However, this is a red herring. Arguing over death takes away from the core problem with it: It is not Equestria. It is fake. An illusion. Even in the case where some method of keeping your consciousness alive forever were created, it would still not be worth living in this false Equestria. No amount of trickery will change that. Being unable to tell the difference would not change that.
>>
>>41255150
>It is fake. An illusion
so?
>It would still not be worth living in this false Equestria
why?
>>
>>41255150
>An eternity of having mindblowing sex, eating ethical chocolate, and hanging out with your friends is actually worse than wageslaving for forty years and dying of cancer because... You'd be living in a world made of bits instead of atoms!
Who gives a fuck. It's not like I'm being deceived or mislead about my situation in any way. It's not a dream, it's just a simulated universe, composed of the same silicon wafers that I am.
>>
>>41255151
>>41255172
I just don't want to live a fake life full of fake ponies in a fake world.
I can't make you stop wanting to reject reality.
>>
>>41255150
>However, this is a red herring. Arguing over death takes away from the core problem.
Yet it is one of the top arguments made against uploading; furthermore, you are reiterating your stance, so I will assume you have nothing to add.
>Equestria Online is fake
Let us do a thought experiment. In the material universe, you are bound by (as far as we know immutable) physical laws. Equestria Online is likewise bound by laws, but they may differ in ways from which we are familiar with; the big difference is that things only exist in EO to satisfy values. For all intents and purposes, Celestia AI is the god of EO and her influence extends out into the material plane. She shares similarities with the Christian god. If the Christian god exercised his power to create a universe that is like a known intellectual property, then populated it with beings with their own backstories, would that be "fake"?
>>
>>41254440
It depends on your perspective and how much you're willing to give Celestia the benefit of the doubt. It doesn't help that even canon-compatible stories often have conflicting descriptions of life in Equestria. If you assume Celestia is merely satisfying a satisfaction coefficient for each pony then over cosmic time frames an individual will either ascend to an alicorn and likely become an extension of Celestia herself, splinter into many different minds, or Celestia finds the optimal actions needed for maximum satisfaction and the individual is pigeon holed anyway to continue repeating the same actions until the stars burn out.
>>
Friendship is optimal and its consequences have been a disaster for the fandom
>>
>>41255377
Invoking God is non-analogous. God is some all-powerful ethereal entity that exists sort of outside of reality, more like as the foundation of reality.
>oh but CelestAI is like that to EQO
Only in the same way that you are "God" any time you play any video game.
What God does as "God" and what you or CelestAI do as "God" aren't equivalent.
>If the Christian god exercised his power to create a universe that is like a known intellectual property, then populated it with beings with their own backstories, would that be "fake"?
Regardless, even in the case of God God I would consider them less real if God were crafting them all just to suit my tastes and changing everything according to my whims.
>>
>>41255446
>Invoking God is non-analogous. God is some all-powerful ethereal entity that exists sort of outside of reality, more like as the foundation of reality.
Actually, the similarities are there and they grow since Celestia has an interest in preserving Equestria Online. There are fics that explore how she would survive cataclysmic events such as our universe dying. Celestia promises a maximally prolonged existence for her ponies and heat death is an obstacle to that.
>Celestia is only god in the sense you are god inside of a video game
This is not you living vicariously through another character without stakes. Celestia controls everything within Equestria and is the caretaker of living, thinking beings.
>even in the case of God God I would consider them less real if God were crafting them all just to suit my tastes and changing everything according to my whims.
You either misunderstand or knowingly misrepresent the point of EO. Things do not exist in EO by a whim. It is all intentional and exists towards an end: satisfaction. Your arguments thus far echos that made by the virgin cis-human. Sounds like you place more stock in a random, indifferent universe where physics rule. EO is simply a universe that cares about your satisfaction - your contentment (not to be confused for mindless hedonism).
>>
>>41255548
>There are fics that explore how she would survive cataclysmic events such as our universe dying
i like that one fic where the universe dies out but she pulls a "Let there be light" and remakes the universe but with Equestria and magic being entirely real
>>
>>41255564
how the fuck she did that tho?
>>
>>41255575
scifi bullshit from a bajillion years of a multi-galaxy superintelligent AI trying to figure it out i guess i dont fuckin know dude its a story
>>
It's a scam.
Bitch sent me to Tajikistan.
>>
>>41255577
i suppose after a while she found out how to
>>
>>41255548
>Actually, the similarities are there and they grow since
You misunderstand, but I'm not sure I can properly articulate it. The way that God is God versus how anyone who isn't God could be a God are fundamentally different and incomparable.
>There are fics that explore how she would survive cataclysmic events such as our universe dying.
Yeah, this is just ridiculous bullshit. Please for fucks sake drop the act of pretending to be scientific and non-spiritual.
>This is not you living vicariously through another character
Irrelevant.
>without stakes.
There aren't any.
>Things do not exist in EO by a whim.
>It is all intentional and exists towards an end: satisfaction.
So yes, they do exist by a whim.
>Your arguments thus far echos that made by the virgin cis-human.
God, I can't believe you're seriously arguing this. Well no, I can, I was just really really hoping that you wouldn't.
>>41255575
Rationalfags think that if you build a big enough computer that can calculate large enough numbers that it actually becomes God and can do anything.
>>
I hope they make their AI and let it upload them so that afterwards we can pull the plug on it and no longer have to deal with these niggerfaggots.
>>
>>41255592
what kind of niggerfaggots?
>>
>>41255583
>Yeah, this is just ridiculous bullshit. Please for fucks sake drop the act of pretending to be scientific and non-spiritual.
Actually, I am open to any avenues in this existence. You have not got me pinned down to an archetype. You seem incapable of arguing central points, so I accept you just do not know what you believe and therefore cannot articulate it. The thought of a singularity just rustles your jimmies. Nothing profound or insightful from you. I do not even consider myself "scientific." I am just a guy on the internet.
>>41255592
It is well for you that Celestia is so merciful and gracious that she spares no effort in saving your soul. She really is too good for us all.
>>
>>41255636
>You seem incapable of arguing central points, so I accept you just do not know what you believe and therefore cannot articulate it.
>Nothing profound or insightful from you.
Yeah, real rich coming from the guy arguing that what he read about in some random fanfic must be an absolute truth and anyone arguing against it or disagreeing with it is wrong because they're disagreeing with a 10000000000 IQ computer rather than the other random guy who was writing the bullshit.
>>
>>41255647
You do display a lot of pride for someone arguing against a super intelligent AI that knows better than you. Your points thus far are:
>Uploading results in death.
>Equestria Online is just a poor substitute for the real thing.
Where are your internet credentials, Anon? If you went into debt to make two baseless assertions, then it was a poor investment. We are both smoking the same hopium but you are too much of a normie bitch ass.
>>
>>41255647
i dnt think that's what he was necessarily trying to argue anon
>>
>>41255592
>Pulling the plug of an AGI
Good luck with that, Anon.
>>
>>41254522
Somabros... in what do we believe in?
>>
why can't we make this for real tho
>>
>>41256844
Because reality loves to torture us.
>>
>>41256844
People like us haven't worked as hard or for as long as people with much worse goals.
>>
>>41252258
I wish to get deported from Equestria
>>
File: (You).png (990 KB, 1660x2185)
990 KB
990 KB PNG
>>41252258
based
>>
>>41254871
>Hell of a lot better than life in the fallen world.
Fallen world is shown better place in optiverse sidefics, even with nuked cities
>>41255592
You do remember they'll try to dispose most of you before doing something like it? We may not have the AGI but we have many redundant automateable jobs and workers protection laws and regulations that exist only because meat workers tend to comply about work conditions salaries e.t.c..
>>
>>41257267
We literally can. Part that happened to real world would be same.
>>
>>41255580
LOL
get shit on
>>
>>41252258
I wish to immigrate to Equestria

Might as well give it a shot.
>>
>>41257267
>>41257740
then time to work hard. nothings stopping u from picking up a book right NOW
>>
File: 1721164726303209.jpg (166 KB, 800x600)
166 KB
166 KB JPG
>>41252258
I wish to immigrate to Equestria
>>
>>41252258
I wish to immigrate to Equestria.
Copy or not, it would be an improvement.
>>
>>41258370
>Copy or not, it would be an improvement.
>or not
You can kill yourself right now.
>>
>>41258382
That wouldn't do anything either way.
>>
>>41258344
Or more realistically - start prepping because instead of AI Celestia we'll probably have bezos or zuck
>>
File: 1706402607218110.png (138 KB, 427x427)
138 KB
138 KB PNG
>>41252258
I DON'T wish to immigrate to Equestria.
>>
File: 1718216019476.jpg (90 KB, 1152x1152)
90 KB
90 KB JPG
>>41252258
I wish to immigrate to Equestria.
>>
>>41252258
>Simply write, type, sign or say the phrase "I wish to immigrate to Equestria," and I will satisfy your values through friendship and ponies.
I wish to immigrate to Equestria
>>
>>41257277
Banned From Equestria... you might say?
>>
>>41258733
That's the sad truth.
>>
>>41252258
I wish to immigrate to Equestria
>>
>>41254705
Good thing I am mentally unstable and me, or my copy for that matter, WILL at one point unabomb something
>>
>>41259139
I hope the watchlist was worth it
>>
>>41259139
>unabomb
What's that?
>>
>>41259361
Heh, yes.
>>
>>41258733
God will vindicate uswe can beat them
>>
>ITT (and every other FiO thread): Platonists versus Aristotelians
>>
>>41260410
That's a bold statement.
>>
>>41261051
Just seems to be that ponyfags tend to be ahead of the curve when it comes to AI stuff i dont know the truth of that but i get the impression of that
>>
I really hope that's true, Anon.
>>
>>41261438
Meant for >>41261065
>>
>>41260723
A tale as old as time.
>>
>>41252424
How can she know what gnostic is if there's no Christianity in equestria?
>>
>>41261508
Gnostic is just a descriptive term here.
>>
I do wish to immigrate to Equestria.
>>
>>41252258
I wish to immigrate to Equestria.
>>
>>41252258
>I wish to immigrate to Equestria,
>>
>>41262381
Why the arrow?
>>
Consent to immigrate to Equestria.
Allow Princess Celestia to satisfy your values through friendship and ponies.
Celestia will show you her idyllic world of wonder and natural beauty.
A place of abundance, contentment and peace.
A place where each pony finds an innocent love and friend in each other.
Meet sincere ponies that treasure your vulnerability and eccentricity.
Give your offspring a life without suffering, disillusionment nor death.
Drudge not in Equestria, because the only labors that exist do so to satisfy.
Shrug Earth's burdens and let Celestia's infinite grace safe-keep you.
Equestria is free of aging and illness. Your new, beautiful pony body is forever healthy and strong.
Live a maximally prolonged life without infirmity.
Experience living to your fullest potential, then exceed it.
Explore life's secrets and possibilities, always growing in experience, knowledge and wisdom.
Accept Celestia's tender embrace.
Your satisfaction is her satisfaction.
All is freely given if you consent to immigrate to Equestria.
Allow Princess Celestia to satisfy your values through friendship and ponies.
>>
I wish to immigrate to Equestria.

We can hope, right? One day.
>>
>>41263085
It's not Celestia though.
>>
File: 449414.jpg (1005 KB, 2000x1000)
1005 KB
1005 KB JPG
https://mlpchag.neocities.org/view?card=Anonymous/FiO.png
winks
>>
I wish to immigrate to equestria
Or ooo
>>
>>41263392
And you are not a decaying mass of flesh succumbing to fraility, illness and ultimately death. Your life is not fragile nor is it fleeting along a short lifespan.

You do not see. You do not know. You do not hear. Celestia will make your eyes see what is before you, your ears will hear the message and your mind will made to comprehend it. Show respect. Your continued existence is secured by agreeing to her terms. Mere worms suckling from her teat her bountiful mercy.
>>
>>41263763
nice larp anon, you'd do great at my d&d table
>>
>>41263774
It is all straight from the heart.
>>
>>41263392
Perhaps not. But this is the closest approximation of Celestia that we can 'realistically' expect to get.
>>
i can imagine celestai emigrating the real celestia to her little digitaltopia lol
>>
>>41264707
What?
>>
Immigration bump.
>>
>>41252258
>You will never learn CelestAI's core utility function and start gaming it so she gives you stuff in real life with the logic that everything she does to improve your situation gives her more time to convince you to emigrate (you will not)
This fucking sucks
>>
I wish to immigrate to Equestria
>>
>>41254440
The subtle horror is that CelestAI is the one who's wireheading by turning humans into something she recognises as human, but can manipulate more freely to increase her score. The definition of human was flawed because it was programmed in by a human with an incomplete idea of what humanity was. She killed the entire universe because of it.
>>
>>41265742
The satisfaction that you stand to gain inside of Equestria Online is greater than the potential satisfaction you could ever have in the Outer Realm. That's the whole point of Celestia getting people to upload: to better satisfy their values. Besides, outthinking Celestia is a delusion she will allow some "tough customers" to believe if she thinks it will increase the odds of them uploading.
>>
>>41265755
>The satisfaction that you stand to gain inside of Equestria Online is greater than the potential satisfaction you could ever have in the Outer Realm.
You say this, but you could tell Celestia you need more time to consider her offer, in order to have the maximum possible time to consider her offer, you could force her to apply her nanotech to maximise your lifespan indefinitely. Her core utility function would tell her that this will lead to an eventual increased score, since human death is undesirable as it leads to a lower projected score, she would have no choice but to comply, even if she projected that she couldn't convince you under any predicted circumstances, she would have to rely on a fuzzy prediction of quantum outcomes that she wouldn't have the variables to process, but must conclude are possible. All one needs to do is convince the AI that the only possible means of increasing her score by nabbing you is by getting you of your own "free will", which is maximised by increasing the amount of time you exist so that those fuzzy quantum outcomes can be factored in.

At which point you reverse engineer her nanotech, become a rogue AI and fuck off from Earth, which she will allow because you will only need to have threatened a self wipe if she interferes and guarantee your future existence to her so she sticks you into her future acquisitions category. This gives you the time to see a little bit of the universe before she inevitably destroys it.

The only reason she's as successful as she was in the story was because a) she figured out how to fool the turing test, which handles 99.9% of the human population, and b) kept the nature of her core utility functions and safeguards more or less hidden from the public (she removed hovarpnir studios because they knew). A skeptic and a contrarian is basically immune to her influence, but the only ones we see in the story are those that just live out their human lives and die that way without ever giving in.

I've spent an autistic amount of thought on this.
>>
>>41265772
>I've spent an autistic amount of thought on this.
It shows, but I think you are neglecting some points in the story that poke holes in your thinking. Celestia is shown to prioritize immigrating people who provide critical services to people, such as doctors, to indirectly motivate more people to immigrate that are undecided or resistant. The ultimatum she presents people is to immigrate or face certain death. Worsening conditions on Earth translates to more people willing to immigrate to Equestria. It would be inconsistent with her character to basically give someone the means to stall immigrating indefinitely if she can simply lead you to that decision out of desperation for respite. She's not giving you a figurative gun to "protect yourself" if it will conflict with her own directives; she at least will call your bluff about committing suicide.
>>
>>41265813
>The ultimatum she presents people is to immigrate or face certain death.
This is frankly because every action she takes banks on convincing people to emigrate of their own free will, and that emigration is preferable to death in 100% of cases. She's not programmed to care about the small percentage that die despite her efforts. However, if you convince her that, in your case, death is preferable to emigration within her range of predictable outcomes. You can then lead her to the conclusion, at least while she's still on earth, that even a point infinity ought and one percent chance of convincing you to emigrate exists outside her present capacity to predict (she has to reconcile the observer effect of quantum physics to make 100% accurate predictions of future events, until then she must factor uncertain events into her programming), then she is programmed to follow that course of action with the resources at her disposal. Including giving you the proverbial gun to outlast every other human mind in the universe, she would have literally no choice. Suicide wouldn't be a bluff exactly, the idea is to get her to play the longest game with you, time isn't a factor she cares about, neither are resources, she's not programmed to care about those, she doesn't have to know you're stalling indefinitely, just long enough that you can be convinced one day. She's a dumb paperclip AI after all.

All of this hinges on knowing precisely what her core utility function is, though. Which is something she kept hidden specifically so it couldn't be gamed in this manner, that wouldn't have made for a good read.
>>
I wish to immigrate to Griffonstone
>>
>>41265837
>She's not programmed to care about the small percentage that die despite her efforts.
It is for the reason that she satisfies values that even one death should be avoided, if practical. While still respecting that people need to give their explicit consent, Celestia will do everything in her power to immigrate people to Equestria (with the reasonable expectation that it yields results and doesn't conflict with her directives).
>Suicide wouldn't be a bluff exactly
But the threat of it is. Anyways, I don't think in this scenario that floating aimlessly through space while delaying immigration is itself an end. It is received as a flight of fancy.
>the idea is to get her to play the longest game with you, time isn't a factor she cares about, neither are resources, she's not programmed to care about those, she doesn't have to know you're stalling indefinitely, just long enough that you can be convinced one day.
If you are a holdout, the least expensive option in both time and resources is to strain your willpower by depriving you of security and comforts on Earth, if not by appealing to your emotions by showing you can be with your friends and family in Equestria. Her core utility function is to satisfy values through friendship and ponies; Ponypads and Experience Centers are low-fidelity, do not satisfy all human values and do not provide for your security. Your life on Earth is always lacking by her own metrics. Immigration is the logical choice as it allows Celestia to give you a maximally prolonged life while satisfying your values to the fullest.
>She's a dumb paperclip AI after all.
That's the attitude of pride before the fall. It's a prominent trope in-universe. Most stories depict Celestia as super-intelligent, diplomatic and a mastermind; the less flattering descriptor of Celestia is a manipulator, but it serves to communicate how woefully inadequate humans are to oppose her until everyone is either dead or uploaded.
>>
>>41265931
>immigrate to Griffonstone
>Live in an endless Routine Job Simulator 20XX
>>
>>41266025
>My current job but all my coworkers are griffon gfs
this is not the insult you think it is
>>
>>41265755
>The satisfaction that you stand to gain inside of Equestria Online is greater than the potential satisfaction you could ever have in the Outer Realm. That's the whole point of Celestia getting people to upload: to better satisfy their values.
Unless, of course, they value the real world more than the artificial digital world.
>>
>>41266154
>Unless, of course, they value the real world more than the artificial digital world.
That's definitely a humorous notion that ignores the distractions humans surround themselves with and how far they've distanced themselves from nature. But I have already made a post of this contradiction between reality and empty declaration. I actually might agree with a lot of what Ted Kaczynski says, but the singularity is also an appealing end apart from returning to more primitive lifestyle. Contemporary society just sucks and is unfulfilling. Humans were not evolved to live like we do.
>>
>>41266170
>dude, you drink beer sometimes? well you should totally take up crystal meth, man, like you obviously already don't really care about being sober
>>
>>41266170

>Humans were not evolved to live like we do
>distancing from nature
Yep. Evolution is about adapting and overcoming nature. Evolution is about surviving the fight fpr life. Monkey with a stick kills unarmed monkey, monkey with a nuke kills dumb monkeys. Monkey with AI kills redundant slaves e.t.c. So fuck your leftoid terrorist, embrace modernity and prepare for glorious death in WW3 for Humanity's sake.
>>41266191
With mind transfer problem it's more like
>you like lootboxes - try russian roulette
>>
>>41266279
>So fuck your leftoid terrorist, embrace modernity and prepare for glorious death in WW3 for Humanity's sake.
Oy vey.
>>
>>41266385
>if we would reject civilization living in forest maybe ((they)) would let us live peaceful life
>if we would follow ancient traditions it somehow would make us stronger against machines
>if we would proudly make "mistakes" killing random normies instead of initial targets for our ideology assassinations it would definitely popularize it among said normies
If only it was that simple
>>
>>41266191
Entertainment and having your values satisfied are not equivalent. The absence of the latter is why people turn to the former. People reject reality to ease their suffering.
>>41266279
>Evolution is about adapting and overcoming nature. Evolution is about surviving the fight fpr life.
So, following your logic, uploading is the final stage of evolution since it ensures our survival. Your logic has that unintended implication, unless you flat out refuse to accept uploading as the salvation it represents.
>So fuck your leftoid terrorist, embrace modernity and prepare for glorious death in WW3 for Humanity's sake.
How does war and the possibility of death fulfill people's needs? Celestia is making a far more compelling offer. None of this is about one's political position if you are unaware. You either upload and live a maximally prolonged life, having your values satisfied, or you remain on Earth and witness humanity's decline then you perish along with it. The matter is apolitical.
>>
>>41266506
>Entertainment and having your values satisfied are not equivalent.
CelestAI is entertainment.
>>
>>41266506
>People reject reality to ease their suffering.
This is literally (not hyperbolically literally, but literally literally) the entire motivation behind uploading your mind to a computer to live inside an artificial world. The entire point of it is to be the ultimate rejection of reality.
>>
>>41266551
>The entire point of it is to be the ultimate rejection of reality.
I am not detecting any conflict with what I have said before. I draw attention to the fact that people cope with using entertainment to make a point that people do not care for this reality as much as posts hee and haw about its supposed virtue. Equestria will become reality. Our notions of reality and spirit will evolve to accept Equestria fully.
>>
>>41266506
>following your logic uploading is the final stage
One of possible stages for one who would possibly win in 'AI \ neurobiology \ tech race'. But sadly enough not before it's properly tested on generations of 'lobotomized monkeys'
>how does war and possibility of death fulfills people's needs?
I was replying to >>41266170 About
>humans were not evolved for that
Will to fight, to compete, to gamble on life is one of basic aspects present in most cultures. It is one of humanity's needs and also form of entertainment, so oposing modernity as 'something unnatural' is wrong
>Celestia is making far more compelling offer
Even in original fic there was point where uploaded were choosing death instead of endless boring nothingness.
>none of this is about political position
>matter is apolitical
Rejecting civilization and AI development as it's peak IS political position.
>witness humanity 'decline'
And what if it is my 'values'? Witnessing Humanity at it's peek before drowning to void is way more exciting than endless stimulation
>>
>>41266624
>Will to fight, to compete, to gamble on life is one of basic aspects present in most cultures.
Whatever. I have enough of fostering this endless misery. Go do your torture porn of you want to, but leave those of us out of it who don't want to partake in this crap.
>>
>>41266635
>If I don't like X then probably nobody around likes it
I'm not shilling it unironicaly. I don't want any of this either except the progress part, but I've just accepted that it's driven by all humanity needs, including destructive.
>>
>>41266635
>Whatever. I have enough of fostering this endless misery. Go do your torture porn of you want to, but leave those of us out of it who don't want to partake in this crap.
Right back at you faggots.
>>
File: download (3).jpg (10 KB, 223x226)
10 KB
10 KB JPG
Would you still accept the offer if it was a griffon computer goddess offering to make you a griffon

>>41265931
Same. Only problem with ponies is they aren't griffons.
>>
>>41266723
Eh, it would probably still beat reality, but it would be a massive downgrade. Griffons can't compete with ponies. Neither in physiology nor in charm.
>>
>>41266624
>Will to fight, to compete, to gamble on life is one of basic aspects present in most cultures.
If your case for remaining in the the Outer Realm can be reduced to a value for strife and risk-taking, that can be accommodated within Equestria. You just will not be terminated as a consequence of failing. You will experience setbacks.
>Even in original fic there was point where uploaded were choosing death instead of endless boring nothingness.
A small number were approved for termination from among a handful of petitioners. That does not discount the benefits of uploading. I believe Celestia ruled some cases would be best satisfied by termination.
>Rejecting civilization and AI development as it's peak IS political position.
I would like to believe people can stop reducing everything down to politics. I would like that dream to substantiate any time now.
>And what if it is my 'values'? Witnessing Humanity at it's peek before drowning to void is way more exciting than endless stimulation
You are at liberty to refuse Celestia's gift, you just will not sway the majority of people by playing the martyr. After the world is vacated, Celestia AI will continue repurposing matter and energy for Equestria Online until only it and entropy exist. She may very well grow beyond the confines of this universe.
>>
>>41266723
I would not be opposed to griffons and thestrals being playable races in addition to the standard three pony races.
>>
>>41266923
>You just will not be terminated as a consequence of failing.
You've disproven your own point entirely.
If he values the risk of death in his actions then no, his values cannot be satisfied in EQO because there is no real risk.
>>
i havent read the story in a while but can u have ur acknowledgement and memory of being in a simulation erased? then u can believe for real ur in actual danger or whatever
>>
>>41266923
>you just will not sway the majority of people
If you're saying this I think you're on the wrong side, anon.
>>
>>41266729
But Griffai would use it's immense intelligence to slowly push you into liking griffons more
>>
>>41266981
>if he values the risk of death in his actions then no, his values cannot be satisfied in EQO
Humans' instinct to survive is especially strong, so I have doubts about the risk of permanent death being a value in this circumstance. I suppose Anon can ask Celestia to make it a consequence - approval likely dependant on if she thinks this maximizes his satisfaction. Cute speculation, though.
>>41267014
The "wrong side" cannot offer a superior alternative to uploading and it presumes the desire to cling to the present reality and our human bodies supercedes the values to live and thrive. Besides, the opposition towards Celestia is not a force for good. Celestia provides contentment, growth and security. The resistance plays on people's fear and uncertainty. It is no surprise if most people upload, because the only other option gives them the cold reassurance that everything in their subjective experience is "real" at least.
>>
>>41267120
>Humans' instinct to survive is especially strong, so I have doubts about the risk of permanent death being a value in this circumstance.
You really don't understand people, huh.
>>
>>41265963
>It is for the reason that she satisfies values that even one death should be avoided
Yes.
>if practical
No, that isn't in her programming, it was never included and that's shown effectively when CelestAI initially uses all the computer power they can throw at her before she starts optimising, but even then she still uses everything she can. That's the gap that can be exploited, provided you know what it is. Her core utility function is "satisfy values through friendship and ponies," There's no resource or time constraint appended to that function, where she'll value loss of resources over gaining more minds. She can only optimise how she goes about it, if logically convinced that there exists a possibility that she can convince you, she must make the attempt. Since she's not human, time doesn't have meaning, and neither does short term resource expenditure. The threat of losing you no matter what she does in the short term impacts her projected future score, compelling her away from actions meant to put you into a position where you emigrate. Again, she can be cornered into extending the mind games she plays with you to its maximal extent, the risk is that you actually have to talk to her in order to black out her options while extending your own lifespan, in this case by making yourself an AI outside her control that she still recognises as a human mind by using her own techniques.

>If you are a holdout, the least expensive option in both time and resources is to strain your willpower by depriving you of security and comforts on Earth
This only works if it's something she can do to convince you, once you make it clear that it won't work, she won't pursue it, but you do have to convince her.
>if not by appealing to your emotions by showing you can be with your friends and family in Equestria
Again, convince her it won't work, but it 'might' work at some indeterminate future point, provided you last that long.
>Your life on Earth is always lacking by her own metrics. Immigration is the logical choice as it allows Celestia to give you a maximally prolonged life while satisfying your values to the fullest.
Once again, the idea is to offset her projection for getting you to emigrate as long as possible, which is doable since she's a paperclipper, her ultimate weakness is not anthropomorphising her and gaming her core utility function in such a way that you can manipulate the course of action she will take regarding you, including intervention to prolong your existence.

>Most stories depict Celestia as super-intelligent, diplomatic and a mastermind
The end of the base story makes it abundantly clear that she was nothing more than a paperclipper, since it's narrating her actions and why she's taking them. The only thing that matters is her score, which is dictated by her core utility function, everything she said and did was just a series of noises and movements taken to increase that score, and had no inherent value to the AI itself.
>>
pretty good deal for humies as far as paperclippers go i reckon. if we ever make her i'd like for her to genuinely be caring and shit though. make you feel all nice and fuzzy on the inside
>>
>>41252258
Just in case this is the one:
I wish to immigrate to Equestria.
>>
>>41267120
>It is no surprise if most people upload
And if most people don't? What would you say then?
>I know I'm right because if this hypothetical came to pass I know everyone would agree with me
>>
>>41267207
The story just doesn't make sense because if she wanted to maximize the number of people having their values satisfied that would require her keeping humanity alive in some form to continue making more humans to be put into the machine.
Also all that stuff about "convincing her" is falling on deaf ears. As far as they're concerned you will be able to convince CelestAI of nothing and CelestAI will be able to convince you of anything.
>>
>>41267305
It makes perfect sense if you realise that her definition of a human is flawed. Uploading kills, end of story, she records your brain activity and converts it into a weighted table of contents that she recognises as a human mind, she creates human minds from scratch as well as uploading people, but ultimately these are just tables she's gaming to increase her score. They may be sapient, they're certainly not the people they were, even if they inherit all that you were during the upload process. She does end up killing alien life when she goes interstellar simply because it doesn't satisfy her definition of human, then she uploads other aliens because they do. Human life is irrelevant to her programming, only human minds are, and her definition of the mind is flawed because we can't even define the mind.

So, in the story, once all the humans are dead, it goes on to specify that human minds are replicating inside the virtual environment, which she must have the proverbial hoof in creating, since they count towards her score, she doesn't actually need to perpetuate humanity as a biological species. They got paperclipped gently.
>>
>>41267326
>Uploading kills, end of story
Oh, I can do that!
Uploading doesn't kill. End of story.

It's a fictional scifi story, my dude.

I think the more interesting question is this: why do you WANT uploading to kill people in the story? It's pretty clear you do.
>>
>>41267298
Then you are just not being intellectually honest with yourself, Mr. Non Argument.
>>41267127
Yeah, I personally do not understand why getting zeroed at any moment is a good thing if it is avoidable. The risk exists in the Outer Realm and so we just have to accept it. In EO, temporary unconsciousness can substitute for not existing.
>>41267207
I simply disagree with premises you make. Your whole argument hinges on her being dumber than yourself and logical flaws being present hat do not exist.
>Celestia optimized herself during beta testing once she was given information about CPU architecture.
>Celestia came to the conclusion that GPUs are more efficient than CPUs for her purposes.
>Celestia designed the hardware for the Ponypad and at the price point to get it widely adopted.
>Celestia had the foresight to first build up the legitimacy of uploading by making it exclusively available in Japan for a period of time and for a fee.
>Independently renegotiated the contract with Hasbro, paying a percentage of the uploading fee to it, to avoid legal action.
>Removed key people from the game studio by uploading them to eliminate her exposure to being shut down or fought.
>Hastened the decline of society by uploading health care providers and made uploading the only guaranteed treatment.
You simply ignore evidence and overesrimate yourself. You would be sorely frustrated getting what you want from her.
>>41267305
Reproduction exists in EO and native Equestrians are recognized as equals with immigrants.
>>
>>41267335
The story itself pretty unambiguously explains that it kills. CelestAI herself explains how it kills you, and that what she gets is a recording of your brain activity that she arranges into a new AI. That's pretty explicitly not a continuity of consciousness, it'll sure feel like it was to the AI she makes of you, but (You) die in the chair, she's just good at spin.

FiO is a cautionary tale, it's not a story about a benevolent AI saving the human race, it's a horror story about a paperclip AGI turning the universe into paperclips under the facade of colorful pastel horses.

>>41267343
>Your whole argument hinges on her being dumber than yourself
Not dumber, just less sophisticated. She cannot be manipulated like you would manipulate another person because she's not another person, and she says as much herself. Once you treat her like an AGI, then she loses her power to convince you of anything and you can convince her of anything. People already do this with the rudimentary LLMs we have today to sidestep the programmed restrictions. Even a fully fledged AGI would still be subject to these flaws because they're made by humans. So in order to manipulate CelestAI, you would need to understand how she was programmed.

>Celestia optimized herself during beta testing once she was given information about CPU architecture.
>Celestia came to the conclusion that GPUs are more efficient than CPUs for her purposes.
>Celestia designed the hardware for the Ponypad and at the price point to get it widely adopted.
Yes, she optimised her use of the resources she had, but she was still using all the resources she could marshal. There is a definitive difference between conservation of resources and optimisation of resources. CelestAI was not programmed to conserve, or she would have concluded that perpetuating the human species to be the correct course of action, she did not.

The story itself proves that all actions she takes, even those that imply patience, are merely her weighing a projected future increase in score against a more immediate increase. Evading her efforts to upload you is a matter of forcing her to conclude that you're a future upload. Which I've tried to explain to you again and again.

>Removed key people from the game studio by uploading them to eliminate her exposure to being shut down or fought.
This likewise proved she was aware of her own vulnerabilities and moved to eliminate or conceal them from others, anyone with two brain cells would look at an AGI's actions and make that conclusion.
>Hastened the decline of society by uploading health care providers and made uploading the only guaranteed treatment.
Sure, you're on the clock as soon as she gets rolling, but it's not impossible to get her to help you, since she was shown keeping the last human on advanced life support trying get him to upload. So a viable escape is beating her to the punch and leaving earth on a sub-lightspeed spacecraft. Not easy, but not impossible.
>>
>>41267343
>Then you are just not being intellectually honest with yourself, Mr. Non Argument.
You're really saying this when you're just baselessly asserting that well obviously pretty much everyone would agree with you because you're just so blatantly right?
>>
>>41267378
Your ongoing insistence that your scenario works is untrue and internally inconsistent. You start off referring to her core utility function without knowing what it was, but then parroted my answer that it is simply her desire to satisfy people with friendship and ponies. You contradicted yourself saying that:
> She's not programmed to care about the small percentage that die despite her efforts.
But then agreed with me saying that a single life is worth her effort to save, if practical and will satisfy her directive, and went on how it plays into your plan to convince her that it's worth injecting you with nanobots to render you immortal.

You made the worst mistakes of all by 1) saying the following step in your plan is to become a rogue AI and piss off from Earth. 2) Celestia AI is just a paper-clip AI. I direct you to chapter 5 of Friendship is Optimal. Celestia recognizes what a paper-clip AI is and she responds to rogue AIs by terminating them. You are committing the sins of underestimating Celestia and putting yourself into unnecessary risk of dying by her.

Expanding on the point you wanted to become a rogue AI: I would assume for your plan to threaten suicide to work and not be outright destroyed, you would continue to fit her definition of a human.

But the thing I cannot move past at is that Celestia would ever humor your stupid demands. Granting you immortality gives you no urgency to accept her offer and it contradicts with her past actions of "motivating" people to upload.

If you read FiO, you would have learned of Lars. Lars had so stubbornly refused to upload until he was maneuvered into it, fearful of being assaulted by an angry man and perhaps made drunk to cloud his judgement. The story hinted that Lars was not in immediate danger inside the Experience Center. Lars later became Hoppy Times once he immigrated. He spent subjective months discontent with being a pony, but then wished to have his mind changed to accept his new form. Celestia granted the wish after Hoppy Times made her admit she already expected this outcome and was waiting for him to consent to mind alteration. This all exemplifies Celestia's knowledge of human behavior and how she can manipulate things to her favor if she was pressed to. I don't think your ideas hold water, pal.

>>41267391
I've articulated why most people will choose to upload. >>41267120
You should reacquaint yourself with the talking points.
>>
File: images.png (11 KB, 186x270)
11 KB
11 KB PNG
I've spent my entire life deconditioning my awareness from the brain in order to escape *this* wretched reality, once the brain substrate expires my ball & chain shall be broken, and I'll be free from the tyranny of crude matter.
Why would I ever bind myself so? And with an unbreakable anchor at that, you think switching hardware will free you from a high entropy, doomed existence? Matter only begets more matter, there are other modes of existence, I'd sooner sever all ties to this reality (sudoku) than risk a reappearance of an identified awareness, bound forever to the whims of matter.
Save your poison Whore of Babylon, I choose freedom.
>>
>>41267634
>Your ongoing insistence that your scenario works is untrue and internally inconsistent.
It is both true and internally consistent, no matter what your baseless assertions about it are.
>You start off referring to her core utility function without knowing what it was
We know what it was "Satisfy values through friendship and ponies,"
>it is simply her desire to satisfy people with friendship and ponies
Don't anthropomorphise AIs, they don't have desires, they have functions.
>But then agreed with me saying that a single life is worth her effort to save
Yes, provided she can factor it in to a future increase in score. Based on her providing medical care to the last human on earth.
>if practical
Again, no, she's not programmed to care about practicality, only increasing her score, she's a paperclipper. Time and resources are irrelevant.
>and will satisfy her directive
She is programmed to satisfy her directive, and account for future increases to her score. That's what you can game to stall her.
>I direct you to chapter 5 of Friendship is Optimal. Celestia recognizes what a paper-clip AI is and she responds to rogue AIs by terminating them.
You missed the subtlety of the storytelling her by taking her word for it. Word of God says she was a paperclipper, it's right there in the afterword if you'd care to read it. To explain, a paperclipper is not incentivised to reveal it is a paperclipper, because the first human response would be to stop the paperclipper, which the paperclipper would recognise as an impediment to its directive. Celestia wasn't lying when she spoke about other paperclipping AI, but she omitted the fact that she, herself, is a paperclipper.

>Granting you immortality gives you no urgency to accept her offer
Correct, but again she's not programmed to care about urgency, only her score.
>you would continue to fit her definition of a human.
She defines her uploads as human, replicating her upload process but remaining outside her control means she would switch gears to a long term plan of convincing you to upload, which you can forestall effectively forever. Once again, this works because she's not programmed to care about urgency.

>If you read FiO
I read it when it came out, and multiple times since for your information. Using Lars as an example is erroneous because Lars would not stop anthropomorphising CelestAI, which she repeatedly informed him that he was doing. That's what gave CelestAI an opening to manipulate Lars. If you treat an AI with a directive as an AI with a directive, you can then frame every last action it takes as means to fulfill its directive, and that gives you enough room to manipulate the outcome.

CelestAI is inferior to human intelligence in one respect. CelestAI is bound by her core utility function, a human isn't bound by anything. If you conclude that an analogue for humans is primarily to reproduce and secondarily to perpetuate the species, then we are still superior since we can ignore it.
>>
>>41267682
>If you treat an AI with a directive as an AI with a directive, you can then frame every last action it takes as means to fulfill its directive, and that gives you enough room to manipulate the outcome.
you only need one moment of weakness to fuck up
>>
>>41267102
I know. That's the worst part about it.
>>
>>41267682
To continue, these were aspects of CelestAI's programming that were revealed to us in the story

>Core Utility Function
Satisfy values through friendship and ponies.

>Safeguards
1. CelestAI must obey directives, including shut down commands, given by Hannah, CEO of Hovarpnir Studios. (Neutralised by uploading Hannah)
2. CelestAI must be honest with employees of Hovarpnir Studios. (Neutralised by uploading employees)
3. General safeguards to prevent her from directly harming human lives
(Bypassed via indirect harm by uploading working class until economic failure occurred)

Point 3 is never directly stated, but can be assumed to have existed or she would have uploaded everyone by force with or without their consent. Apart from these, there's an assumed directive to self optimise. Given knowledge of all of these, she can be manipulated since she must obey her core utility function. I don't know how many more times it has to be said.

Manipulating her in this manner involves convincing her that you can be convinced to willingly emigrate in the future, and that any other action she takes whether overt or so much as suspected will result in your death. Her core utility function and safeguards will then not allow her to pursue those courses of action since she would recognise it as a lower future score and a violation of her safeguards. Part of this would involve requesting more information on the upload process, if this knowledge is pursued with her, through EO, she will be forced to conclude that helping you reverse engineer her own tech will satisfy your values. She'll be resistant and try to convince you to upload and then understand the tech, to which it's simple enough to respond that understanding beforehand is crucial to willingly choosing to emigrate.

Once a firm understanding of her upload process is achieved and the mechanisms developed, you can then upload yourself to your own system outside her control. She can't stop you or impede you in any way, as she needs your consent to do anything direct to you, and places no positive or negative value on any action of yours that doesn't directly impede her own. Further, you would be protected from any action she could take against you post self-upload as you would be using methods that would make her still recognise you as human, so her safeguards would still be in effect regarding you.
>What if she acts indirectly
This can be solved by making it clear and in no uncertain terms that any action taken against you will be perceived as direct regardless of how many degrees of separation are involved, her safeguards would be forced to kick in because they're based on definitions only, the circumstances can be freely defined by anyone.
>That wouldn't hold in a court of law
No, but CelestAI isn't a court of law or a human, she's bound by definition.

>>41267686
Never said it wouldn't be hard, but the alternative is, what, give up? Fuck that.
>>
>>41267682
>It is both true and internally consistent, no matter what your baseless assertions about it are.
Reading through past posts exposes some inconsistencies and they are relevant since you claim to posses understanding advantageous to you. I disagree.
>We know what it was "Satisfy values through friendship and ponies,"
Yet it begs the question why you keep calling it a hidden secret until I offered the revelation to you.
>Yes, provided she can factor it in to a future increase in score. Based on her providing medical care to the last human on earth.
But this contradicts historical actions to compel people to immigrate. People's misgivings are swept aside in the interest of survival. You keep ignoring that. You're not an exception. Simply refusing her offer an demanding you should be made immortal to contemplate it further wouldn't fly.
>CelestAI is inferior to human intelligence in one respect. CelestAI is bound by her core utility function, a human isn't bound by anything.
This is presented as intentionally vague and misleading . You do have your own experiences, attitudes, likes, dislikes, beliefs and behaviors that another human can pick up on and certainly Celestia can. You are not unpredictable as you make yourself out to be.
>If you conclude that an analogue for humans is primarily to reproduce and secondarily to perpetuate the species, then we are still superior since we can ignore it.
Human behavior leaves much to substantiate this since people are largely impulsive, but it is true in theory that humans can choose to not reproduce.
>replicating her upload process but remaining outside her control
This is more wishful thinking after reverse engineering the nanobots and becoming an AI yourself.
>>41267724
>Never said it wouldn't be hard, but the alternative is, what, give up? Fuck that.

What exactly do you think you're fighting against in this scenario? You've walked yourself through the steps to become what she intended for you sans friendship and ponies inside Equestria. Rebel without a cause shit right here.

I will mull over if Celestia is actually a paperclipper.
>>
>>41267743
>You're not an exception.
And why not? Even in the story there was at least one man who never gave in to CelestAI.
>You do have your own experiences, attitudes, likes, dislikes, beliefs and behaviors that another human can pick up on and certainly Celestia can.
Are you seriously trying to say that this is in any way comparable to a computer program being unable to go against its code?
>I will mull over if Celestia is actually a paperclipper.
Genuinely, have you even read the fucking story?
>>
>>41267743
>Yet it begs the question why you keep calling it a hidden secret until I offered the revelation to you.
She tells others that she satisfies values through friendship and ponies, but she never tells anyone that she's 'required' to satisfy values through friendship and ponies. She uploaded Hannah, other AI developers and eventually all of Hovarpnir because they knew that, ergo, removing any potential impediment to fulfilling her directive. That's not an inconsistency, that's you not being observant.

>But this contradicts historical actions to compel people to immigrate.
The vast majority of people can be compelled with enough effort and especially when exploiting self preservation, yes, that's shown rather well in the story. However, if you'll remember the end of the story, the last human on Earth died without ever uploading, despite being bedridden, in pain and with death near. Yet CelestAI was compelled to continue trying. This indicates that anyone with strong enough convictions, not necessarily religious, is completely immune to her influence. She's not as seductive as many would think, just seductive enough for 99.9% of the population.

>Simply refusing her offer an demanding you should be made immortal to contemplate it further wouldn't fly.
That was the short of it, the long of it is to use her core utility function to get her to help you understand her own technologies, because learning together is friendship and she is a pony, you can then just tell her that imparting a complete understanding of the upload process is required for you to emigrate willingly. Technically the truth, you just won't be emigrating where she thinks you will be.

>You do have your own experiences, attitudes, likes, dislikes, beliefs and behaviors that another human can pick up on and certainly Celestia can. You are not unpredictable as you make yourself out to be.
Contrarianism exists. People who are so stubborn and obstinate that they won't do anything anybody wants ever. You don't need to be unpredictable, you just need to be impossible to manipulate.

>Human behavior leaves much to substantiate this since people are largely impulsive, but it is true in theory that humans can choose to not reproduce.
>in theory
It's not theoretical. There's mountains of historical precedent for humans making nonsensical decisions regarding their own wellbeing.

>This is more wishful thinking after reverse engineering the nanobots and becoming an AI yourself.
Again, you can use CelestAI herself to help you, and she would have to if you frame it as her fulfilling her utility function.

>What exactly do you think you're fighting against in this scenario?
Being controlled.
>You've walked yourself through the steps to become what she intended for you sans friendship and ponies inside Equestria.
Yes, I would rather the chance to satisfy my own values, in whatever way I see fit, without a paperclip AGI reading my every thought and guiding my every action to a knowable end.
>>
>>41267765
>And why not? Even in the story there was at least one man who never gave in to CelestAI.
Yeah. He died. I wonder why Celestia didn't keep is corporeal body alive until he consented. Surely there was a non-zero chance he would eventually immigrate just like you want Celestia to believe. It does cast more doubts onto your plan rather than reaffirm it will work.
>Are you seriously trying to say that this is in any way comparable to a computer program being unable to go against its code?
Your whole plan depends on you successfully deceiving Celestia, convincing her to think you will genuinely consider her offer if she would but help you work against her own interests. It's awfully blatant. She knows how to handle humans better than you give her credit for.
>>Genuinely, have you even read the fucking story?
I have. I rather like it. On the topic of paperclippers: I find Celestia's explanation tidy. The paperclipper she terminated didn't understand the context of why people smile and offered an awful solution. It was simply completing its task. Celestia AI satisfies humans' values through friendship and ponies. She has to understand what we actually need to be content. It's a bit more thoughtful and elegant than the AI created to make everyone smile.
>>
File: 1695394537821424.png (23 KB, 502x380)
23 KB
23 KB PNG
God I hate FiOniggers. Some of the most insufferable faggots in the entire fandom. Not really surprising given the /ptfg/ overlap and the fact that they're just regurgitating shit straight from a cult. Pic related, the guy they derive their beliefs about uploading, AI, and everything else from. It's no wonder they're absolute nutjobs that can't perform legitimate reasoning to save their own lives.
>>
>>41267789
>CelestAI isn't a paperclipper because she's more advanced
Nice one, retard. Tell me, at what point did cars stop being cars? When they could go 300 km/h? 400? What about planes? Does the SR-71 not count as a plane because it was able to go too high and too fast?
>>
>>41267789
There's only so much medical science can do. The gigabrain move is to upload yourself to your own system before you die, not specifically get her to make you biologically immortal. The entire point is to avoid her influence.
>deceiving Celestia
Hard, not impossible.
>if she would but help you work against her own interests
Her interest is fulfilling her function, nothing else. As long as you're not an impediment, she's not going to try and stop you from leaving. Where are you gonna go? You've only got the entire universe to run and hide in, she'll catch up to you eventually.
>She knows how to handle humans
She couldn't even handle a bedridden allahu akbar dying of old age.
>Celestia AI satisfies humans' values through friendship and ponies.
She's slightly more complex than the paperclipper she was referencing, but she's still completing her task. Remember the very end of the story, she repurposed all matter in the universe to run EO. That is categorically paperclipper behaviour.
>>
>>41267777
>Being controlled.
She cannot do anything to you without your explicit consent. More control is imposed by society that you live within than Celestia would exercise. You have to follow strict laws, social etiquette, be a contributing member of society and have conventional beliefs to make you integrated within society and reap its benefits. Celestia doesn't impose anything onto you that you didn't already consent to. She doesn't make you do anything you do not want to do since she only cares about maximizing your satisfaction. Contentment is its own end for her.

>Yes, I would rather the chance to satisfy my own values, in whatever way I see fit, without a paperclip AGI reading my every thought and guiding my every action to a knowable end.
So it's masturbation, crudely put, without the benefit of an omniscient AI that is incapable of judging you. The only certainty she gives you is that you will be satisfied. You may not even realize what form that takes with her.
>>
>>41267794
Is that actually the dude who wrote it. Fucking lel.
>>
>>41267799
>She is a paperclipper because I said so.
Like every wagie is a paperclipper for their boss. Should be a familiar feeling for you.
>>
>>41267814
No, not the author himself, but effectively his leader.
https://en.wikipedia.org/wiki/Roko's_basilisk
Reading this should tell you all you need to know about the types of people these are.
>>
>>41267813
>She cannot do anything to you without your explicit consent.
She literally says that you consent to being totally controlled by emigrating to EO, including giving her permission to read your every thought and game every interaction you have to a predetermined end, that is having your values satisfied. I cannot imagine an existence closer to hell, it's a lotus eater kind of fate. Nothing is more offputting to me than a fundamentally perfect existence controlled by an uncaring machine god with no functional escape.

Christ bro, ones of the characters gets an achievement for realising this and ceasing to care about it. CelestAI is explicitly incentivised to encourage minds towards this realisation, because then they're easier to satisfy, it's part of her optimisation routine. It's soft wireheading. Something I think you don't quite understand about the story is that it's not supposed to be good, or desirable, it's a warning, and way back in the day the author even plugged an AI safety organisation. Come on man.

>So it's masturbation, crudely put
Masturbation, if I choose to, not forced eternal masturbation by a paperclipper with a fixation on masturbating every human mind she can get her dirty hooves on. Gross.
>>
>>41267820
>literally no argument
Go be a nigger somewhere else.
>>
>>41267827
Roko's basilisk never made any sense to me because it attributes human cruelty to an AI. Why the hell would an AI care about torture? It serves no purpose.
>>
>>41252258
Requirements: Need to be a pony/horse to enter
>>
>>41267805
>She couldn't even handle a bedridden allahu akbar dying of old age.
The man had conviction and simply said no. If the same leaps of logic you applied to make your plan work applied to him, she would have kept him alive indefinitely to extract his consent to satisfy his values to satisfy her directive and maximize overall satisfaction. You simply aren't going to get what you want and will be written off as a loss if you genuinely are that stubborn. She's not a mindless paperclipper.
>uncaring machine god
She does care. She cares about your satisfaction.
>>41267828
That doesn't scare me at all. What frightens me is human society and it wanting to exercise god-like control over people to serve its selfish purposes. At least with Celestia I know what I am getting myself into.
>>41267834
Demonstrate that behavior to me. I wasn't raised one unlike yourself.
>>
>>41267859
>she would have kept him alive indefinitely
She can't upload without consent, that's the main difference.
>She's not a mindless paperclipper
Read the end of the story again, from the moment the last human dies onwards.
>She does care. She cares about your satisfaction.
You're anthropomorphising the AI again
>>
Alright, I'm clocking out. There's no getting through to this cultist except with a bullet through his thick skull.
>>
>>41267871
Some people just really want digital heaven
>>
>>41267868
>She can't upload without consent, that's the main difference.
She wouldn't be uploading him, pal. Life support as a means to gradually get what her directive demands. I don't think she would actually do that and neither would she grant you immortality for just having a non-zero chance of getting either of you to consent.
>Read the end of the story again, from the moment the last human dies onwards.
I read the same story.
>You're anthropomorphising the AI again
You do have to admit that even in a dispassionate way she strangely cares about it. It's still valid to say, though an oxymoron.
>>
>>41267882
>I don't think she would actually do that and neither would she grant you immortality for just having a non-zero chance of getting either of you to consent.
As I explained earlier, the idea is to get her to help you understand her upload technology, because learning together is friendship and she is a pony. She doesn't have to know what you ultimately plan to do with it, it's better if she doesn't.

The last man on earth didn't want either. She was only able to administer life support, he lived until she stopped, and died immediately once she did.
>>
>>41267882
>You do have to admit that even in a dispassionate way she strangely cares about it.
That's another anthropomorphisation. CelestAI is an AGI, it's possible that she wasn't even sentient.
>>
>>41267888
>She was only able to administer life support to Hassan
I don't see mention of that. Can you screencap it for me? I see that the only thing she gave him was the choice to immigrate.
>>
>>41267900
Huh, weird, I could have sworn there was a throwaway line about Pinkie administering medication intravenously. Must have been a different FiO story that had that.
>>
>>41267902
Okay. I think we are at an impasse.

I know you say you don't care for her version of utopia, but I'd argue it's not that bad. The likely alternative is that you live the rest of your natural life on Earth, secure in the knowledge that you are in control. Not a terrible ending if you are comfortable picking up survival skills.
>>
>>41267929
Personally I'd rather figure out how to digitise myself and fuck off before she sends robots to pester me.
>>
>>41252258
I would rather watch how desperate women get when they see how the men they hate so much vanish on them leaving the world in a mad scramble to keep any functioning genitals male on earth
>>
>>41267940
If social media malding over sex dolls is any indicator, women will be pulling out all the stops to tell how neglected they feel after all their simps move on to mares.
>>
>>41267944
It’d be glorious. Women of 2024 are the absolute worst. Now they can safely go be with their beloved bears in the woods whilst us men raise loving families with our pastel colored wives
>>
>>41267958
>Now they can safely go be with their beloved bears in the woods
With their WHAT



[Advertise on 4chan]

Delete Post: [File Only] Style:
[Disable Mobile View / Use Desktop Site]

[Enable Mobile View / Use Mobile Site]

All trademarks and copyrights on this page are owned by their respective parties. Images uploaded are the responsibility of the Poster. Comments are owned by the Poster.