I am an easily bored Omega-level being, and I want to play a game with you.
I am going to offer you two choices.
Choice 1: You spend the next thousand years in horrific torture, after which I restore your local universe to precisely the state it is in now (wiping your memory in the process), and hand you a box with a billion dollars in it.
Choice 2: You spend the next thousand years in exquisite bliss, after which I restore your local universe to precisely the state it is in now (wiping your memory in the process), and hand you a box with an angry hornet's nest in it.
Which do you choose?
Now, you blink. I smile and inform you that you made your choice, and hand you your box. Which choice do you hope you made?
You object? Fine. Let's play another game.
I am going to offer you two choices.
Choice 1: I create a perfect simulation of you, and run it through a thousand simulated years of horrific torture (which will take my hypercomputer all of a billionth of a second to run), after which I delete the simulation and hand you a box with a billion dollars in it.
Choice 2: I create a perfect simulation of you, and run it through a thousand simulated years of exquisite bliss (which will take my hypercomputer all of a billionth of a second to run), after which I delete the simulation and hand you a box with an angry hornet's nest in it.
Which do you choose?
Now, I smile and inform you that I already made a perfect simulation of you and asked it that question. Which choice do you hope it made?
Let's expand on that. What if instead of creating one perfect simulation of you, I create 2^^^^3 perfect simulations of you (up-arrow notation; see the sketch below)? Which do you choose now?
What if instead of a thousand simulated years, I let the boxes run for 2^^^^3 simulated years each? Which do you choose now?
I have the box right here. Which do you hope you chose?
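(For scale: "2^^^^3" is Knuth's up-arrow notation, where each extra arrow iterates the operation below it: 2^3 = 8, 2^^3 = 2^(2^2) = 16, 2^^^3 = 2^^(2^^2) = 2^^4 = 65536, and 2^^^^3 = 2^^^4 = 2^^65536, a power tower of 65536 twos. A minimal Python sketch of the definition follows; the function name "arrow" is purely illustrative, and only the small cases shown will ever finish, since 2^^^^3 itself is far beyond any physical computation.)

    # Knuth up-arrow notation: arrow(a, n, b) is "a, followed by n arrows, followed by b".
    def arrow(a: int, n: int, b: int) -> int:
        if n == 1:
            return a ** b          # one arrow is ordinary exponentiation
        if b == 0:
            return 1               # base case of the recursion
        return arrow(a, n - 1, arrow(a, n, b - 1))

    # Small cases that actually terminate:
    assert arrow(2, 1, 3) == 8      # 2^3
    assert arrow(2, 2, 3) == 16     # 2^^3 = 2^(2^2)
    assert arrow(2, 2, 4) == 65536  # 2^^4 = 2^(2^(2^2))
    assert arrow(2, 3, 3) == 65536  # 2^^^3 = 2^^(2^^2) = 2^^4
    # arrow(2, 4, 3), i.e. 2^^^^3 = 2^^^4 = 2^^65536, would never return.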
First of all, thank you for the question. I found it thought-provoking and will be thinking about it later.
Second of all, I want to let you know that, for part of my initial answer, I am replacing all of the "events which then get deleted with no observable consequence" with "magical gibberish", because it feels like the dragon from http://lesswrong.com/lw/i4/belief_in_belief/ : there are no observable effects, regardless of how many years, simulated mes, or simulated-me-years you throw into it.
I also note that my answer depends on whether or not I expect to make the choice repeatedly. Suppose I make the choice once:
1: Torturous magical gibberish, and you get a billion dollars.
2: Blissful magical gibberish, and you get angry hornets.
In that case the billion seems a fairly clear choice. But if you're going to pull some sort of "Wait, before you open the box, I'm going to give you the same choice again, forever" shenanigans, then the billion/hornets is never reachable and all I have left is never-ending magical gibberish. In that case I'll take the blissful magical gibberish over the torturous magical gibberish.
Oddly, I'd be more reluctant to take the billion if you said you were going to torture OTHER people, who weren't me. I feel like that says something about my value system/thought processes, and I feel like I should subject that something to a closer look. It makes me feel like I might be saying "Well, I think I'm a P-zombie, and so I don't matter, but other people really do have qualia, so I can't torture them for a billion." But maybe I'm just imagining what another person who didn't accept my premises would say/do upon finding out that I had made that choice. Or perhaps it's an absolute denial macro: "Don't subject countless others to countless years of torture for personal gain, even if it seems like the right thing to do."
I'm not sure which, if any, of those is my true rejection, or even whether I'd actually reject it.
As I said before, though, I'm still thinking about some of this, and I would not be surprised if there is a flaw somewhere in my thinking above.
In a billion years, there might be no evidence that your whole life ever happened. Does that mean it's magical gibberish?