I am an easily bored Omega-level being, and I want to play a game with you.
I am going to offer you two choices.
Choice 1: You spend the next thousand years in horrific torture, after which I restore your local universe to precisely the state it is in now (wiping your memory in the process), and hand you a box with a billion dollars in it.
Choice 2: You spend the next thousand years in exquisite bliss, after which I restore your local universe to precisely the state it is in now (wiping your memory in the process), and hand you a box with an angry hornet's nest in it.
Which do you choose?
Now, you blink. I smile, inform you that you already made your choice, and hand you your box. Which choice do you hope you made?
You object? Fine. Let's play another game.
I am going to offer you two choices.
Choice 1: I create a perfect simulation of you, and run it through a thousand simulated years of horrific torture (which will take my hypercomputer all of a billionth of a second to run), after which I delete the simulation and hand you a box with a billion dollars in it.
Choice 2: I create a perfect simulation of you, and run it through a thousand simulated years of exquisite bliss (which will take my hypercomputer all of a billionth of a second to run), after which I delete the simulation and hand you a box with an angry hornet's nest in it.
Which do you choose?
Now, I smile and inform you that I already made a perfect simulation of you and asked it that question. Which choice do you hope it made?
Let's expand on that. What if instead of creating one perfect simulation of you, I create 2^^^^3 perfect simulations of you? Which do you choose now?
What if instead of a thousand simulated years, I let the simulations run for 2^^^^3 simulated years each? Which do you choose now?
I have the box right here. Which do you hope you chose?
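(A note on the notation, in case it's unfamiliar: 2^^^^3 is presumably Knuth's up-arrow notation, 2↑↑↑↑3, where each additional arrow iterates the operation before it. A rough sketch of how fast it grows, under that reading:

\[
\begin{aligned}
2\uparrow 3 &= 2^3 = 8\\
2\uparrow\uparrow 3 &= 2^{2^2} = 16\\
2\uparrow\uparrow\uparrow 3 &= 2\uparrow\uparrow 4 = 2^{2^{2^2}} = 65{,}536\\
2\uparrow\uparrow\uparrow\uparrow 3 &= 2\uparrow\uparrow\uparrow 4 = 2\uparrow\uparrow 65{,}536
\end{aligned}
\]

That last value is a tower of exponentiated 2s stacked 65,536 levels high, so "2^^^^3 simulations" or "2^^^^3 simulated years" is not a quantity anyone can meaningfully picture.)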
A lot of people are going for Choice 1 of Game 2 with the idea that it isn't appropriate to give ethical consideration to a simulation of oneself.
This is just silly. There's no guarantee that a simulation of myself would share that sentiment about the situation. The illusion of acceptability here would go away once the simulations of myself reason through either of the following possibilities:
A) As the seconds of subjective experience accumulate, the simulations will be less and less me and more and more someone else as the torture sets in. If you accept the premise that our experiences feed back into our perception of who we are and how the world works, then the "self" part of self-inflicted pain gets pretty blurry.
B) I suspended my sense of compassion under the belief that different versions of myself don't need that kind of consideration. That belief rested on a further belief: since the simulations were of myself, it was acceptable to inflict said pain. In other words, the choice carries only the moral weight of self-inflicted pain, without my actually experiencing the pain, since conceptually no one else is experiencing it. Problem: if I considered the simulations enough a part of my own experience that they fall outside the class of other people who must be shown decency, why wasn't I compelled to choose the thousand years of bliss out of self-interest? The simulations would see that inconsistency in the "acceptable self-harm" argument and recognize it as a rationalization. Which it is.
Suppose there were a skeleton on an island somewhere, belonging to somebody who had no contact with civilization before he died, yet spent the last years of his life in unimaginable suffering. Suppose there were an easy-to-execute magical action that would alleviate that past suffering, with no need to worry about changing the future. Would you feel compelled to perform it? If yes, then you must also feel compelled to rescue simulations of yourself, because their lack of current existence puts them on the same footing as the dead man on the island.