/sci/ - Science & Math


Thread archived.
You cannot reply anymore.




If you one-box in Newcomb's problem I legitimately think you might not be human.
You are cattle, incapable of intelligent thought.
All you can do is become confused about topics you don't understand.
You write a bunch of symbols and assume that just because similar symbols are assigned meaning in pure mathematics then your worthless gibberish is meaningful too.
One boxers lose money.
Two boxers stay winning.
The only way one-boxers cope with their fallacious reasoning and poverty-inducing answer is by magically assuming that the conditions of the problem change when they one-box. They don't, since the whole point of problems comparing decisions and decision procedures is to start from identical conditions and compare the outcomes of the two decisions.
You will take one box and it will be empty. You will go back home sad, poor and alone.
Enjoy your piece of cardboard.
>>
Let me guess, you got btfo in the main thread?
>>
>>16928946
>ai will correctly predict your choice ahead of time with high accuracy
>this is the premise upon which the whole thought experiment is based
>npcs cannot entertain hypotheticals due to lack of abstract thought
>the inescapable conclusion from the premise is two boxers don't like money
this cannot be refuted
>>
>>16928946
>>16931033
But I did have breakfast today
>>
>>16928946
>watch someone else enter the room, pick one box, get a million
>watch another person, they pick 2 boxes only get $1,000
>this time, the person seems to be really smart and thinks really hard, picks 2 boxes, still only gets $1,000
>this happens a thousand more times
>”huh… this AI supercomputer alien God sure is good at predicting things… it’s almost as if everyone who picks 1 box is guaranteed to get a million, and everyone who picks 2 boxes will only get $1,000”
>”yippee, now it’s my turn !”
>”hmmm…. I think I will take both!”
are 2-boxers fucking retarded?
>>
>>16928946
would your answer change if it was $0.01 instead of $1000?
>>
>>16928946
>One boxers lose money.
>Two boxers stay winning.
You just watched 1000 one-boxers leave with a million bucks and 1000 two-boxers leave with a measly $1,000, and you still went ahead and posted that.
>>
>>16929093
He sounds like a bit of a two-boxer to me...
>>
>>16928946
How is this even a paradox? Picking one box is the obvious choice
>>
>>16928946
>All you can do is become confused about topics you don't understand.
>You write a bunch of symbols and assume that just because similar symbols are assigned meaning in pure mathematics then your worthless gibberish is meaningful too.
Funny, in the other thread it was two-boxers (or maybe just you) that were namedropping 'strategic dominance', 'object permanence' and other clever-sounding concepts the applicability of which they don't understand.
>>
>>16928946
one-box choosers are greedy liars who act as if, were they not to take both, the predictor would read that decision and give them a million dollars. their choice should not be taken seriously, and they should be called out for being the lying faggots they are
>>
Being called all that is a cheap price for a million dollars.
>>
>>16928946
This is not even a paradox: you assume an inconsistent set T (*) of axioms (so that T proves everything) and then say that, among the people who believe A holds under T and those who believe B holds under T, one of these crowds is dumb or misled.
When in fact T proves both A and B because it is inconsistent, hence it proves everything.

(*) T contains (or proves) the following claim:
There are at least two different people such that
(1°) the first is able to predict the future without making any mistakes while doing so;
(2°) the second has free will.
>>
>>16931033
>muh AI
kys
>>
Imagine you have a debt of $900,000 and it's ruining your life and driving you to suicidal thoughts. You would take one box only, no question about that. Who gives a flying crap if two-boxers have some better argument or logic, or if you get an extra $1,000? Who cares. If a thousand one-boxers went into the room and came back with a million dollars, then you would just take the one box if that somehow ensures you the million. You just want the debt to go away, no matter what it takes.
>>
>>16931039
Yes.
>>
File: cat-sphere.gif (963 KB, 498x373)
I open up the box, I take out the money, I then take a gigantic steaming shit on the money and smear every single bill with my excrement.
It's unbacked FIAT slop.
Burn the money changers' tables y'all, don't just flip them over.

https://youtu.be/ohlW9SbhLzc?si=2ylhvrjp1OveIC5x
>>
>>16928946
Nice bait op

Honestly they could just delete the rest of the scenario and skip straight to
>there's a 100% chance of getting $1 million if you pick this box
>there's a 100% chance of getting $1k if you pick both boxes

The rest is just flavor text.
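Taken at face value, that reduction can be written down directly (a sketch; the $1M and $1k figures are from the standard setup, and the "predictor is effectively never wrong" reading is the one this post assumes):

```python
# The whole problem, under the "prediction is always right" reading:
# your choice fixes which row of the payoff table you land in.
ONE_BOX = 1_000_000        # predicted one-boxing -> opaque box filled
TWO_BOX = 1_000 + 0        # predicted two-boxing -> opaque box empty

print(ONE_BOX - TWO_BOX)   # one-boxing nets $999,000 more
```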

>>16929046
I truly believe two-boxers have a pathological sense of entitlement, that's the only way their "reasoning" makes sense.
Ha! If it was me, I would take both boxes, imagine being satisfied with one million and leaving a thousand dollars on the table... What do you mean, there wouldn't be a million dollars for me? That doesn't make any sense, there's one million for you when you take one box, but there isn't for me when I take two boxes? Like... there's never a million dollars for me to take in the first place? But there magically is for you? How could there be 0 in my box and 1 million in yours? You idiot, it was all predetermined before we entered the room, so naturally there ought to be 1 million in my box too. Or else there wouldn't be a choice in the first place, my box would be empty, even though there's 1 million in yours. Now that's just preposterous, of course there's $1 million waiting for me to take it alongside the $1k, you fucking moron. Just watch... $0?! In my box?!! This "superintelligence" must be retarded! Good thing I didn't take one box!
>>
File: 1773545137124038.jpg (24 KB, 450x579)
My main position is that this is an impossible hypothetical, so it doesn't make much sense to debate it.

In any situation where anything like the paradox's setup comes up in real life, we don't have to explain to each other what we really mean or imagine the "predictor" to do. If we got a more hands-on version of the predictor, the answer would likely be obvious.

The rational thing to do irl is to take two boxes. Meanwhile, when you're verbally asked the puzzle, you're giving the questioner an answer conditioned on a "magical" scenario: "imagine a hyper-accurate predictor exists". Well, if something like that exists, whether it be aliens or some mind-reading device, then sure, conditioned on that strange existence, go with one box.

Because clearly none of us will actually get into this situation, it's hypothetical circle-jerking. To not be rude, answer, which means condition on the hypothetical situation of the question, and give the latter answer.
Anybody seen my take formulated somewhere?
>>
>>16928946

The winning strategy is to flip a coin and let that make your decision.
YOU choosing will lose with high probability, since the AI will have correctly guessed your decision.

WTF is everyone retarded on this?
>>
File: COME HOME.jpg (277 KB, 1500x1027)
>Walk into room with boxes
>The universe cannot progress until I make a yes/no, left/right, black/white, go/no-go, yin/yang decision
>Hold the universe hostage with indecision and indecisiveness
>Choose no boxes
>DoorMa'amm(You) I've Come To Bargain
>The room has become its own self contained relativity unit
>Time passes faster
>The boxes and the contents of the boxes begin to experience nuclear decay
>The drywall of the room itself deteriorates
>The end has come
>The lord of time awaits your challenge.....
>You step through the portal and the final battle begins.....
>>
niggas dont even define their game correctly, leave it open to interpretation, and call that a "paradox"
>>
>>16928946
Too many veritardium threads as of late.
>>
>>16933410
Flipping a coin is strictly worse than picking one box. Even if we assume the oracle isn't capable of predicting it, it still lowers your expected value. And based on the premise, it is capable of predicting coin flips. You aren't the first person to have thought of flipping a coin. Every retard who thinks they're being clever has already flipped a coin, and it hasn't affected the outcome of the predictions. Everybody who has taken one box has walked out with a million dollars and everybody who has taken both has left with $1,000. This includes people who flipped coins, rolled dice, or played any other stupid games unoriginal faggots try to play with it. The prediction has been right every single time.
You're also assuming the oracle is adversarial. It isn't. It's not trying to make you lose. You don't have to go in endless circles where it's constantly trying to fuck you over no matter which action you take. Just take one box and bring your million dollars back home, or take both and walk home with a measly $1,000. That's it. It's not a trick.
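The expected-value point is a two-line calculation. A sketch, assuming (per the premise) the oracle predicts the coin flip too and fills the opaque box accordingly; the 0.5 probability and the payoffs are the standard setup:

```python
# EV of the coin-flip strategy when the oracle predicts the flip:
# heads -> one-box (box filled, $1M); tails -> two-box (box empty, $1k).
P_HEADS = 0.5
ev_coin = P_HEADS * 1_000_000 + (1 - P_HEADS) * 1_000
ev_one_box = 1_000_000.0

print(ev_coin, ev_one_box)   # 500500.0 1000000.0
```

So randomizing halves your expectation relative to just committing to one box.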
>>
>>16931033
So two boxers are true christians while one boxers are of the synagogue of satan
>>
>>16933410
peak Dunning-Kruger lmao
>>
>>16933414
The joke is on you, as you have already chosen the first box through such a literal imagination of the machinations behind the veil. In truth, it unfolds into abstract psychedelic splendor.
>>
>>16934829
This can be modeled with a Monte Carlo, right?
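It can, sure. A minimal sketch; the 99% accuracy, the trial count, and the `simulate` helper are all assumptions for illustration, not part of the problem statement:

```python
import random

def simulate(strategy, accuracy=0.99, trials=100_000, seed=0):
    """Average payoff when the predictor guesses the player's choice
    correctly with the given accuracy and fills the opaque box
    ($1M) only when it predicts one-boxing."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        choice = strategy(rng)                       # "one" or "two"
        predicted = choice if rng.random() < accuracy else (
            "two" if choice == "one" else "one")
        opaque = 1_000_000 if predicted == "one" else 0
        total += opaque if choice == "one" else opaque + 1_000
    return total / trials

one_box  = simulate(lambda rng: "one")               # ~990,000
two_box  = simulate(lambda rng: "two")               # ~11,000
coinflip = simulate(lambda rng: rng.choice(["one", "two"]))  # ~500,000
print(one_box, two_box, coinflip)
```

One-boxing dominates, and the coin flip just averages the two pure strategies, exactly as the reply above argues.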
>>
>>16928946
The fucking computer has a supernatural ability to predict your intention. Set your intention on the mystery box and the computer will reward you.

If the computer DIDN'T have a supernatural predictive power, but instead made unreliable (and realistic) guesses, then of course you should take both boxes.
>>
>>16934829
The robot is a statistical morphism for the actions of the population. The two boxers, the dumb wretches they be, are quite literally your adversary. Like a woman, they just can't help it. This is their nature.
>>
>>16934829
>Everybody who has taken one box has walked out with a million dollars and everybody who has taken both has left with 1,000.
You assume that the experiment has been done multiple times with different people. The paradox itself doesn't say anything about repetition, but what if I go one step further and just assume that I will have more chances in the future to pick a box regardless of my choice and the outcome now?

In this case, the smartest move is still to one-box, because even if the oracle mispredicts me the first couple times, then I can be almost sure that someday, I WILL have the opportunity to get the million. If I two-box from the start, I will NEVER have the chance to get a million. (No, you will not have the opportunity to two-box a thousand times to get your million.)

Imagine I work at a new job, where I have to guard a vault with money inside. There are two vaults: A small one with a thousand dollars inside (box A), and a big one with a million dollars inside (box B). As a new employee, my employer (the oracle) predicts me as untrustworthy, so they will put me on guard duty at the small vault only, and I will not be given access to the big vault (box B is empty).

If I steal the money on day one (that is, I take the thousand dollars, which means I am effectively two-boxing, because if I stole everything from vault A I would surely also steal everything from vault B if I had access to it), then they will never give me access to vault B, and I doubt I will have many more chances to grab the thousand dollars again.

If I really wanted to get the million, then I need to build trust, so I will not steal the money from vault A, which is basically one-boxing (meaning I leave work everyday with 0 stolen dollars), because I only want to get what's under box B, and I don't care about the measly thousand dollars. And someday, I will be given access to vault B, and the million will be mine.
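The vault analogy above can be sketched as a toy loop. This is my own illustration, not part of the original problem: the trust rule (big box stays empty until some number of consecutive one-box rounds, and any two-boxing burns trust forever) and the `build_trust=3` threshold are assumptions:

```python
def repeated_newcomb(choices, build_trust=3):
    """Toy iterated model of the vault analogy: the player starts
    untrusted (big box empty); after `build_trust` consecutive one-box
    rounds the oracle fills the big box; two-boxing burns trust forever."""
    streak, burned, total = 0, False, 0
    for choice in choices:
        big = 1_000_000 if (streak >= build_trust and not burned) else 0
        total += big if choice == "one" else big + 1_000
        if choice == "two":
            burned = True
        else:
            streak += 1
    return total

patient = repeated_newcomb(["one"] * 10)   # 7_000_000: trusted from round 4
thief   = repeated_newcomb(["two"] * 10)   # 10_000: never sees the big vault
```

The patient one-boxer forgoes small payoffs early and collects the million repeatedly; the day-one thief caps out at a thousand per round.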
>>
>>16935623
>You assume that the experiment has been done multiple times with different people
No I don't. It's literally stated in the premise you fucking retard. Don't reply to me if you're fucking retarded.


