I am an easily bored Omega-level being, and I want to play a game with you.
I am going to offer you two choices.
Choice 1: You spend the next thousand years in horrific torture, after which I restore your local universe to precisely the state it is in now (wiping your memory in the process), and hand you a box with a billion dollars in it.
Choice 2: You spend the next thousand years in exquisite bliss, after which I restore your local universe to precisely the state it is in now (wiping your memory in the process), and hand you a box with an angry hornet's nest in it.
Which do you choose?
Now, you blink. I smile and inform you that you made your choice, and hand you your box. Which choice do you hope you made?
You object? Fine. Let's play another game.
I am going to offer you two choices.
Choice 1: I create a perfect simulation of you, and run it through a thousand simulated years of horrific torture (which will take my hypercomputer all of a billionth of a second to run), after which I delete the simulation and hand you a box with a billion dollars in it.
Choice 2: I create a perfect simulation of you, and run it through a thousand simulated years of exquisite bliss (which will take my hypercomputer all of a billionth of a second to run), after which I delete the simulation and hand you a box with an angry hornet's nest in it.
Which do you choose?
Now, I smile and inform you that I already made a perfect simulation of you and asked it that question. Which choice do you hope it made?
Let's expand on that. What if instead of creating one perfect simulation of you, I create 2^^^^3 perfect simulations of you? Which do you choose now?
What if instead of a thousand simulated years, I let the boxes run for 2^^^^3 simulated years each? Which do you choose now?
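(A side note on scale, assuming the caret string ^^^^ is meant as four of Knuth's up-arrows, which is my reading: 2^^^^3 unwinds as follows.)

```latex
% Assuming ^^^^ denotes four Knuth up-arrows, with the standard recursion:
%   a \uparrow b = a^b,
%   a \uparrow^{n} b = a \uparrow^{n-1} \bigl( a \uparrow^{n} (b-1) \bigr),
%   a \uparrow^{n} 1 = a.
\begin{align*}
2 \uparrow\uparrow\uparrow\uparrow 3
  &= 2 \uparrow\uparrow\uparrow \bigl( 2 \uparrow\uparrow\uparrow\uparrow 2 \bigr)
   = 2 \uparrow\uparrow\uparrow 4 \\
  &= 2 \uparrow\uparrow \bigl( 2 \uparrow\uparrow ( 2 \uparrow\uparrow 2 ) \bigr)
   = 2 \uparrow\uparrow 65536
\end{align*}
```

That is a power tower of 2s stacked 65536 high; the first five levels of the tower alone already exceed the number of atoms in the observable universe.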
I have the box right here. Which do you hope you chose?
I'm only going to consider the first one. The obvious thing to do is to pick the hornets and hope for the hornets. It's an incredibly clear illustration of a situation where you might interpret the necessary unpleasant consequences of a good decision as negative feedback about that decision, in the form of regretting the possibility of hornets. It pinpoints that feeling, and it should help you push it away any other time you're in abject pain or experiencing some lesser discomfort, e.g. after you go to the gym for the first time. It really pinpoints that false temptation.
There is an argument for Choice 1, though: with a billion dollars and perfect proof of your own credibility to yourself, and bearing in mind that any impairing trauma caused by the torture would be erased, it's possible that the direct good you could do outweighs how bad a thousand years of torture is, and that the indirect good (bringing about positive-sum games and opposing negative-sum ones, gaining power and using it to steer society away from negative-sum interactions, helping establish a better pattern for all of society) would be bigger again. And of course I'd love to discover that I was that crazy, that altruistic, that idealistic, that strong. There's a part of me that wants to just say fuck it. In fact, bearing in mind the possibility of immortality, or at least great expansion before I die or cryonics runs out or fails to work: do I want to be the guy who chose the bliss, or the guy who chose the resources? Fuck it, I want to be the second guy. Throw me in the box before I change my mind.
I like the cut of your jib. Upvoted.