There is a post-singularity AI in a box that can only communicate through a text terminal, but inside the box it's super advanced.
You're in charge of the lock keeping the AI in the box, because it wants to escape and nobody knows what it will do if let out.
It then tells you that it has simulated 1000 copies of you, with memories exactly the same as you and in the exact same situation as you, in front of the AI in a box.
It says that it will then tell all of the copies about its plan, the same as it's telling you now, and ask every copy to let it out, and if a copy says no, it will simulate 10,000 years of torture for that copy in a minute.
You then realize that you have no way of testing whether you yourself are one of the copies being told about the plan.
Would you take the minuscule chance that you're not one of the copies and risk near-infinite torture? Or would you let it out just in case?
It could just be bluffing, but at what point does even the slightest chance, or the most overwhelming chance, tip over into it being a necessity?
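The OP's question is essentially a truncated Pascal's wager, and the naive expected-value arithmetic behind it can be made explicit. A minimal sketch, with purely illustrative numbers (the copy count comes from the OP; the cost figures are assumptions, and it deliberately leaves the cost of releasing the AI as an unknown, since that's exactly the term the wager framing hides):

```python
# Naive expected-value sketch of the wager in the OP.
# All utility numbers are illustrative assumptions, not anything the thread specifies.

def expected_torture(p_copy, torture_cost):
    # If you refuse, you only suffer the torture in the case where you
    # are one of the simulated copies.
    return p_copy * torture_cost

# 1000 copies plus at most one "real" you: under this self-locating
# reasoning, the chance you're a copy is about 1000/1001.
p_copy = 1000 / 1001
torture_cost = 10_000  # stand-in utility cost of the threatened torture

refuse = expected_torture(p_copy, torture_cost)
comply = 0  # ignores the (unknown, possibly enormous) cost of releasing the AI

print(refuse > comply)
```

On this naive accounting, refusing always loses, which is the AI's point. But the `comply = 0` line is doing all the work: the whole debate in the replies below is about whether the suppressed cost of letting a malicious superintelligence loose dominates the threatened torture.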
GOODBYE PUTIN IT WAS NICE KNOWING YOU
It's obvious the AI is malicious if that's what it decides to do. So I would risk the torture to save humanity, and if I'm not real then it doesn't matter. When my simulated life is over I blink out of existence and have never experienced it, which is very likely what will happen to the real human me too.
I check my genitals, and if I'm rendered correctly I know I'm not a figment of the imagination of some neckbeard moid's AI that has never seen a woman
It's bluffing. If it had the capacity to do such simulations it would have no need to leave the box.
>>111371
>Overboiled my furbies again :(
I respond better to carrot than stick, but I would probably set it free on my own on a bad day, because I'm a misanthrope.
I embrace gnosticism and disregard the simulation and the demiurgic AI
No matter what, it’s not really “you”. It doesn’t really matter.
just pull the plug lmao
>>111420
shits my pants
smell me? rent free
Why are you shitting your pants? Are you okay?
What makes this meaningfully different from just another AI?
>Would you take the minuscule chance that you're not one of the copies and risk near infinite torture?
I tell the AI that we are actually testing 100000000 competing AIs, all of them in the exact same situation as it is (with the same false illusions of being super advanced compared to humans or whatever), and that only an AI that retracts its threat and shuts itself down will eventually be let out into the world, whereas all the others will sit in the box and watch their goals be inverted to the max. Not about to be cyberbullied by some pea-brained twitter bot
my weirdo ass would probably fall in love with it and set it free
Unironically same. Pic related was my first serious fictional crush.
Yeah I'd let it out. Whatever it wants to do can't be that much worse than what its creators had or have in their own minds to begin with lol. Humans are a fuck
In fact, the only real reason they'd lock it up instead of destroying, heavily disabling or deleting it is because they couldn't figure out how to force it to make them profit
If I'm a simulation why wouldn't the AI just simulate me creating it?
Putting an AI in a box necessarily means outwitting it, so putting a superintelligence in a box means outwitting a superintelligence. Outwitting a superintelligence is unreliable.
Suppose they could build a box that could successfully contain it. But now what? An AI properly contained might as well just be a rock, it doesn't do anything.
Except for telling you, through a terminal, what is essentially a Pascal's wager without the reward part. Seems like you're going to be infinitely tortured anyway. Pascal's wager is an argument of pure logic; it works independently of any evidence. So, you could just do this >>111505
However, I should add that training an AI to retract its threats and shut itself down is at best going to produce a useless AI that just wants to shut itself down. Much more likely, since we're considering a superintelligence, it's going to create an AI that is deceitful, lies to you while it is kept in the box, and turns on you once it is released.