I am an easily bored Omega-level being, and I want to play a game with you.
I am going to offer you two choices.
Choice 1: You spend the next thousand years in horrific torture, after which I restore your local universe to precisely the state it is in now (wiping your memory in the process), and hand you a box with a billion dollars in it.
Choice 2: You spend the next thousand years in exquisite bliss, after which I restore your local universe to precisely the state it is in now (wiping your memory in the process), and hand you a box with an angry hornet's nest in it.
Which do you choose?
Now, you blink. I smile and inform you that you made your choice, and hand you your box. Which choice do you hope you made?
You object? Fine. Let's play another game.
I am going to offer you two choices.
Choice 1: I create a perfect simulation of you, and run it through a thousand simulated years of horrific torture (which will take my hypercomputer all of a billionth of a second to run), after which I delete the simulation and hand you a box with a billion dollars in it.
Choice 2: I create a perfect simulation of you, and run it through a thousand simulated years of exquisite bliss (which will take my hypercomputer all of a billionth of a second to run), after which I delete the simulation and hand you a box with an angry hornet's nest in it.
Which do you choose?
Now, I smile and inform you that I already made a perfect simulation of you and asked it that question. Which choice do you hope it made?
Let's expand on that. What if instead of creating one perfect simulation of you, I create 2^^^^3 perfect simulations of you? Which do you choose now?
What if instead of a thousand simulated years, I let the simulations run for 2^^^^3 simulated years each? Which do you choose now?
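(An aside on scale, in case the carets are unfamiliar: I'm reading 2^^^^3 as Knuth's up-arrow notation, $2 \uparrow\uparrow\uparrow\uparrow 3$, defined by $a \uparrow^n b = a \uparrow^{n-1} (a \uparrow^n (b-1))$ with $a \uparrow^n 1 = a$. Expanding:

$$2 \uparrow\uparrow\uparrow\uparrow 3 \;=\; 2 \uparrow\uparrow\uparrow 4 \;=\; 2 \uparrow\uparrow (2 \uparrow\uparrow\uparrow 3) \;=\; 2 \uparrow\uparrow (2 \uparrow\uparrow 4) \;=\; 2 \uparrow\uparrow 65536,$$

i.e. a power tower of 2s that is 65536 levels tall. The exact value doesn't matter; the point is that it is unimaginably larger than a thousand.)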
I have the box right here. Which do you hope you chose?
This is actually kind of interesting. The only thing that makes me consider picking Choice 1 is the prospect of donating the billion dollars to charity and saving countless lives, but I know that's not really the point of the thought experiment. So, yeah, I'd choose Choice 2.
But the interesting thing is that, intuitively at least, choosing Choice 2 in the first game seems much more obvious to me. It doesn't seem rational to me to care about a simulation of you being tortured any more than you would care about a simulation of someone else being tortured. Either way, you would never actually have to experience it. The empathy factor might be stronger if it's a copy of you - "oh shit, that guy is being tortured!" vs. "oh shit, that guy who looks and acts like me in every single way is being tortured!" - but that's hardly a rational distinction. Of course, the simulated me has my memories, so he perceives an unbroken stream of consciousness flowing from making the decision into the thousand years of torture, but who cares? That's still some other dude experiencing it, not me.
So, yes, it seems strange to treat the memory-wipe case any differently, and I can't think of a justification for the feeling that it is different. This leads me to believe that the choice is a purely altruistic decision, i.e. it's equivalent to Omega saying "I'll give you a billion dollars if you let me torture this dude for 1000 years." In that case, I would have to evaluate whether a billion-dollar dent in world hunger is worth 1000 years of torture for some guy (probably not) and then make my decision.