"What's the worst that can happen?" goes the optimistic saying.  It's probably a bad question to ask anyone with a creative imagination.  Let's consider the problem on an individual level: it's not really the worst that can happen, but would nonetheless be fairly bad, if you were horribly tortured for a number of years.  This is one of the worse things that can realistically happen to one person in today's world.

What's the least bad, bad thing that can happen?  Well, suppose a dust speck floated into your eye and irritated it just a little, for a fraction of a second, barely enough to make you notice before you blink and wipe away the dust speck.

For our next ingredient, we need a large number.  Let's use 3^^^3, written in Knuth's up-arrow notation:

  • 3^3 = 27.
  • 3^^3 = (3^(3^3)) = 3^27 = 7625597484987.
  • 3^^^3 = (3^^(3^^3)) = 3^^7625597484987 = (3^(3^(3^(... 7625597484987 times ...)))).

3^^^3 is an exponential tower of 3s which is 7,625,597,484,987 layers tall.  You start with 1; raise 3 to the power of 1 to get 3; raise 3 to the power of 3 to get 27; raise 3 to the power of 27 to get 7625597484987; raise 3 to the power of 7625597484987 to get a number much larger than the number of atoms in the universe, but which could still be written down in base 10, on 100 square kilometers of paper; then raise 3 to that power; and continue until you've exponentiated 7625597484987 times.  That's 3^^^3.  It's the smallest simple inconceivably huge number I know.
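
To get a hands-on feel for how fast the tower grows, here is a minimal Python sketch (an editor's illustration, not part of the original post; 3^^^3 itself is far beyond any computer, so the script only reports digit counts for the first few layers):

```python
import math

LOG10_3 = math.log10(3)

# Layers 1-3 of the tower are small enough to compute exactly:
layer3 = 3 ** (3 ** 3)      # 3^^3 = 3^27 = 7625597484987

# Layer 4, 3^(3^^3), cannot be stored: it has layer3 * log10(3)
# base-10 digits -- about 3.6 trillion of them.
digits_layer4 = layer3 * LOG10_3
print(f"layer 4 has about {digits_layer4:.3e} digits")   # ~3.64e+12

# Layer 5's digit count is itself a number with roughly 3.6 trillion
# digits, and 3^^^3 keeps stacking until the tower is
# 7,625,597,484,987 layers tall.
```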

Now here's the moral dilemma.  If neither event is going to happen to you personally, but you still had to choose one or the other:

Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?

I think the answer is obvious.  How about you?

Comments


The answer that's obvious to me is that my mental moral machinery -- both the bit that says "specks of dust in the eye can't outweigh torture, no matter how many there are" and the bit that says "however small the badness of a thing, enough repetition of it can make it arbitrarily awful" or "maximize expected sum of utilities" -- wasn't designed for questions with numbers like 3^^^3 in. In view of which, I profoundly mistrust any answer I might happen to find "obvious" to the question itself.

Isn't this just appeal to humility? If not, what makes this different?

Robin: dare I suggest that one area of relevant expertise is normative philosophy for-@#%(^^$-sake?!

It's just painful -- really, really, painful -- to see dozens of comments filled with blinkered nonsense like "the contradiction between intuition and philosophical conclusion" when the alleged "philosophical conclusion" hinges on some ridiculous simplistic Benthamite utilitarianism that nobody outside of certain economics departments and insular technocratic computer-geek blog communities actually accepts! My model for the torture case is swiftly becoming fifty years of reading the comments to this post.

The "obviousness" of the dust mote answer to people like Robin, Eliezer, and many commenters depends on the following three claims:

a) you can unproblematically aggregate pleasure and pain across time, space, and individuality,

b) all types of pleasures and pains are commensurable such that for all i, j, given a quantity of pleasure/pain experience i, you can find a quantity of pleasure/pain experience j that is equal to (or greater or less than) it. (i.e. that pleasures and pains exist on one dimension)

c) it is a moral fact that we ought to select the world with more pleasure and less pain.

But each of those three claims is hotly, hotly contested. And almost nobody who has ever thought about the questions seriously believes all three. I expect there are a few (has anyone posed the three beliefs in that form to Peter Singer?), but, man, if you're a Bayesian and you update your beliefs about those three claims based on the general opinions of people with expertise in the relevant area, well, you ain't accepting all three. No way, no how.

Torture.

Consider three possibilities:

(a) A dust speck hits you with probability one.
(b) You face an additional probability 1/(3^^^3) of being tortured for 50 years.
(c) You must blink your eyes for a fraction of a second, just long enough to prevent a dust speck from hitting you in the eye.

Most people would pick (c) over (a). Yet 1/(3^^^3) is such a small number that by blinking your eyes one more time than you normally would, you increase your chances of being captured by a sadist and tortured for 50 years by more than 1/(3^^^3). Thus (b) must be better than (c). Consequently, most people should prefer (b) to (a).
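
The argument can be made concrete with a toy expected-disutility calculation. A minimal sketch, with invented numbers: the disutility figures are assumptions, and 3^^3 stands in for 3^^^3 because the real number cannot be represented (the real denominator is unimaginably larger, which only strengthens the conclusion):

```python
from fractions import Fraction

speck = Fraction(1)              # assumed: one dust speck = 1 unit
torture = Fraction(10) ** 12     # assumed: 50 years of torture = 10^12 units

# 3^^3 = 3^27 stands in for 3^^^3; the true denominator is vastly
# larger, making option (b)'s expected disutility smaller still.
standin = Fraction(3) ** 27      # 7,625,597,484,987

expected_b = torture / standin   # option (b): tiny chance of torture
print(expected_b < speck)        # True: (b) already beats the certain speck
```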

You know, that actually persuaded me to override my intuitions and pick torture over dust specks.

A consistent utilitarian would choose the torture, but I don't think it's the moral choice.

Let's bring this a little closer to home. Hypothetically, let's say you get to live your life again 3^^^3 times. Would you prefer to have an additional dust speck in your eye in each of your future lives, or else be tortured for 50 years in a single one of them?

Any takers for the torture?

Even when applying the cold, cruel calculus of moral utilitarianism, I think that most people acknowledge that egalitarianism in a society has value in itself, and assign it positive utility. Would you rather be born into a country where nine out of ten people are destitute (<$1,000/yr) and the tenth is very wealthy ($100,000/yr)? Or be born into a country where almost everyone subsists on a modest amount ($6,000–8,000/yr)?

Any system that allocates benefits (say, wealth) more fairly might be preferable to one that allocates more wealth in a more unequal fashion. The same goes for negative benefits: the dust specks may result in more total misery, but there is utility in distributing that misery equally.

Very-Related Question: Typical homeopathic dilutions are 10^(-60). On average, this would require giving two billion doses per second to six billion people for 4 billion years to deliver a single molecule of the original material to any patient.
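
The dilution arithmetic checks out to order of magnitude; a quick back-of-envelope sketch (the one-mole dose size and the per-person reading of "two billion doses per second" are assumptions):

```python
AVOGADRO = 6.022e23
SECONDS_PER_YEAR = 3.156e7

# At a 10^-60 dilution, a dose of ~1 mole of solvent carries:
molecules_per_dose = AVOGADRO * 1e-60        # ~6e-37 molecules

total_doses = (2e9                # doses per second, per person
               * 6e9              # people
               * 4e9              # years
               * SECONDS_PER_YEAR)

print(total_doses * molecules_per_dose)      # ~0.9: about one molecule
```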

Could one argue that if we administer a homeopathic pill of vitamin C in the above dilution to every living person for the next 3^^^3 generations, the impact would be a humongous amount of flu-elimination?

If anyone convinces me of that, I might agree to be a Torturer. Otherwise, I assume that the negligibility of the speck, plus people's resilience, would mean no lasting effects. The disutility would vanish in milliseconds. If they wouldn't even notice or remember the specks after a while, it would equate to zero disutility.

It's not that I can't do the maths. It's that the evil of the speck seems too diluted to do harm.

Just like homeopathy is too diluted to do good.

If you could take all the pain and discomfort you will ever feel in your life, and compress it into a 12-hour interval, so you really feel ALL of it right then, and then after the 12 hours are up you have no ill effects - would you do it? I certainly would. In fact, I would probably make the trade even if it were 2 or 3 times longer-lasting and of the same intensity. But something doesn't make sense now... am I saying I would gladly double or triple the pain I feel over my whole life?

The upshot is that there are some very nonlinear phenomena involved with calculating amounts of suffering, as Psy-Kosh and others have pointed out. You may indeed move along one coordinate in "suffering-space" by 3^^^3 units, but it isn't just absolute magnitude that's relevant. That is, you cannot recapitulate the "effect" of fifty years of torturing with isolated dust specks. As the responses here make clear, we do not simply map magnitudes in suffering space to moral relevance, but instead we consider the actual locations and contours. (Compare: you decide to go for a 10-mile hike. But your enjoyment of the hike depends more on where you go, than the distance traveled.)

"If you could take all the pain and discomfort you will ever feel in your life, and compress it into a 12-hour interval, so you really feel ALL of it right then, and then after the 12 hours are up you have no ill effects - would you do it? I certainly would.""

Hubris. You don't know, can't know, how that pain would/could be instrumental in processing external stimuli in ways that enable you to make better decisions.

"The sort of pain that builds character, as they say".

The concept of processing 'pain' in all its forms is rooted very deep in humanity -- get rid of it entirely (as opposed to modulating it as we currently do), and you run a strong risk of throwing the baby out with the bathwater, especially if you then have an assurance that your life will have no pain going forward. There's a strong argument to be made for deference to traditional human experience in the face of the unknown.

Anon prime: dollars are not utility. Economic egalitarianism is instrumentally desirable. We don't normally favor all types of equality, as Robin frequently points out.

Kyle: cute

Eliezer: My impulse is to choose the torture, even when I imagine very bad kinds of torture and very small annoyances (I think one can go smaller than a dust mote; perhaps a letter on the spine of a book your eye sweeps over being set in a slightly less well-chosen font). Then, however, I think of how much longer the torture could last and still not outweigh the trivial annoyances if I take the utilitarian perspective, and my mind breaks. Condoning 50 years of torture, or even a day's worth, is pretty much the same as condoning universes of agonium lasting for eons in the face of numbers like these, and I don't think I can condone that for any amount of a trivial benefit.

I'll go ahead and reveal my answer now: Robin Hanson was correct, I do think that TORTURE is the obvious option, and I think the main instinct behind SPECKS is scope insensitivity.

Some comments:

While some people tried to appeal to non-linear aggregation, you would have to appeal to a non-linear aggregation which was non-linear enough to reduce 3^^^3 to a small constant. In other words it has to be effectively flat. And I doubt they would have said anything different if I'd said 3^^^^3.

If anything is aggregating nonlinearly it should be the 50 years of torture, to which one person has the opportunity to acclimate; there is no individual acclimatization to the dust specks because each dust speck occurs to a different person. The only person who could be "acclimating" to 3^^^3 is you, a bystander who is insensitive to the inconceivably vast scope.

Scope insensitivity - extremely sublinear aggregation by individuals considering bad events happening to many people - can lead to mass defection in a multiplayer prisoner's dilemma even by altruists who would normally cooperate. Suppose I can go skydiving today but this causes the world to get warmer by 0.000001 degree Celsius. This poses very little annoyance to any individual, and my utility function aggregates sublinearly over individuals, so I conclude that it's best to go skydiving. Then a billion people go skydiving and we all catch on fire. Which exact person in the chain should first refuse?

I may be influenced by having previously dealt with existential risks and people's tendency to ignore them.

While some people tried to appeal to non-linear aggregation, you would have to appeal to a non-linear aggregation which was non-linear enough to reduce 3^^^3 to a small constant.

Sum(1/n^2, 1, 3^^^3) < Sum(1/n^2, 1, inf) = (pi^2)/6

So an algorithm like "order utilities from least to greatest, then sum with a weight of 1/n^2, where n is their position in the list" could pick dust specks over torture while recommending most people not go skydiving (as their benefit is outweighed by the detriment to those less fortunate).
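
A minimal sketch of that aggregation rule (the disutility figures are invented for illustration):

```python
def weighted_disutility(disutilities):
    """Sum disutilities weighted by 1/n^2, worst experience first.

    Since sum(1/n^2) converges to pi^2/6 ~= 1.645, no number of
    identical small harms can aggregate past ~1.645 times one harm.
    """
    ranked = sorted(disutilities, reverse=True)   # worst first
    return sum(d / n ** 2 for n, d in enumerate(ranked, start=1))

# 3^^^3 specks won't fit in a list, but the bound doesn't care:
# a million unit-specks already approach the pi^2/6 ceiling.
specks = weighted_disutility([1] * 10 ** 6)    # ~1.6449
torture = weighted_disutility([10 ** 9])       # assumed: torture = 10^9 specks
print(specks < torture)                        # True: the rule picks SPECKS
```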

This would mean that scope insensitivity, beyond a certain point, is a feature of our morality rather than a bias; I am not sure what to make of that outcome.

That said, while this gives an answer to the first problem that some seem more comfortable with, and an answer to the second that everyone agrees on, I expect there are clear failure modes I haven't thought of.

Edited to add:

This of course holds for weights of 1/n^a for any a>1; the most convincing defeat of this proposition would be showing that weights of 1/n (or 1/(n log(n))) drop off quickly enough to lead to bad behavior.

Oh, just had a thought. A less extreme yet quite related real world situation/question would be this: What is appropriate punishment for spammers?

Yes, I understand there're a few additional issues here, that would make it more analogous to, say, if the potential torturee was planning on deliberately causing all those people a DSE (Dust Speck Event)

But still, the spammer issue gives us a more concrete version, involving quantities that don't make our brains explode, so considering that may help work out the principles by which these sorts of questions can be dealt with.

I think this all revolves around one question: Is "disutility of dust speck for N people" = N*"disutility of dust speck for one person"?

This, of course, depends on the properties of one's utility function.

How about this... Consider one person getting, say, ten dust specks per second for an hour vs. 10 × 60 × 60 = 36,000 people getting a single dust speck each.

This is probably a better way to probe the issue at its core. Which of those situations is preferable? I would probably choose the second. However, I suspect one person getting a billion dust specks in their eye per second for an hour would be preferable to 1000 people getting a million per second for an hour.

Suffering isn't linear in dust specks. Well, actually, I'm not sure subjective states in general can be viewed in a linear way. At least, if there is a potentially valid "linear qualia theory", I'd be surprised.

But as far as the dust specks vs torture thing in the original question? I think I'd go with dust specks for all.

But that's one person vs buncha people with dustspecks.

How bad is the torture option?

Let's say a human brain can have ten thoughts per second; or the rate of human awareness is ten perceptions per second. Fifty years of torture then means nearly sixteen billion tortured thoughts, or perceptions of torture.

Let's say a human brain can distinguish twenty logarithmic degrees of discomfort, with the lowest being "no discomfort at all", the second-lowest being a dust speck, and the highest being torture. In other words, a single moment of torture is 2^19 = 524288 times worse than a dust speck; and a dust speck is the smallest discomfort possible. Let's call a unit of discomfort a "dol" (from Latin dolor).

In other words, the torture option means roughly 1.6 × 10^10 moments × 2^19 dols, about 8 × 10^15 dols in all; whereas the dust-specks option means 3^^^3 moments × 1 dol.

The assumptions going into this argument are the speed of human thought or perception, and the scale of human discomfort or pain. These are not accurately known today, but there must exist finite limits — humans do not think or perceive infinitely fast; and the worst unpleasantness we can experience is not infinitely bad. I have assumed a log scale for discomfort because we use log scales for other senses, e.g. brightness of light and volume of sound. However, all these assumptions can be empirically corrected based on facts about human neurology.
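
Plugging the comment's own assumptions into code (a sketch; the ten-perceptions-per-second rate and the 20-step log scale come from the paragraphs above):

```python
SECONDS_PER_YEAR = 3.156e7                 # ~365.25 days

moments = 50 * SECONDS_PER_YEAR * 10       # 10 perceptions/s for 50 years
dols_per_moment = 2 ** 19                  # top of the 20-step log scale

torture_total = moments * dols_per_moment
print(f"{moments:.3g} moments, {torture_total:.3g} dols in total")
# ~1.58e10 moments, ~8.27e15 dols -- a huge but finite number,
# to be weighed against 3^^^3 moments x 1 dol for the specks.
```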

Torture is really, really bad. But it is not infinitely bad.

That said, there may be other factors in the moral calculation of which to prefer. For instance, the moral badness of causing a particular level of discomfort may not be linear in the amount of discomfort: causing three dols once may be worse than causing one dol three times. However, this seems difficult to justify. Discomfort is subjective, which is to say, it is measured by the beholder — and the beholder only has so much brain to measure it with.

Robin is absolutely wrong, because different instances of human suffering cannot be added together in any meaningful way. The cumulative effect when placed on one person is far greater than the sum of many tiny nuisances experienced by many. Whereas small irritants such as a dust mote do not cause "suffering" in any standard sense of the word, the sum total of those motes concentrated at one time and placed into one person's eye could cause serious injury or even blindness. Dispersing the dust (either over time or across many people) mitigates the effect. If the dispersion is sufficient, there is actually no suffering at all. To extend the example, you could divide the dust mote into even smaller particles, until each individual would not even be aware of the impact.

So the question becomes, would you rather live in a world with little or no suffering (caused by this particular event) or a world where one person suffers badly, and those around him or her sit idly by, even though they reap very little or no benefit from the situation?

The notion of shifting human suffering onto one unlucky individual so that the rest of society can avoid minor inconveniences is morally reprehensible. That (I hope) is why no one has stood up and shouted yay for torture.

J Thomas: You're neglecting that there might be some positive-side effects for a small fraction of the people affected by the dust specks; in fact, there is some precedent for this. The resulting average effect is hard to estimate, but (considering that dust specks seem to mostly add entropy to the thought processes of the affected persons), would likely still be negative.

Copying g's assumption that higher-order effects should be neglected, I'd take the torture. For each of the 3^^^3 persons, the choice looks as follows:

1.) A 1/(3^^^3) chance of being tortured for 50 years.
2.) A probability-1 chance of getting a dust speck.

I'd definitely prefer the former. That probability is so close to zero that it vastly outweighs the differences in disutility.

Eliezer, a problem seems to be that the speck does not serve the function you want it to in this example, at least not for all readers. In this case, many people see a special penny because there is some threshold value below which the least bad bad thing is not really bad. The speck is intended to be an example of the least bad bad thing, but we give it a badness rating of one minus .9-repeating.

(This seems to happen to a lot of arguments. "Take x, which is y." Well, no, x is not quite y, so the argument breaks down and the discussion follows some tangent. The Distributed Republic had a good post on this, but I cannot find it.)

We have a special penny because there is some amount of eye dust that becomes noticeable and could genuinely qualify as the least bad bad thing. If everyone on Earth gets this decision at once, and everyone suddenly gets >6,000,000,000 specks, that might be enough to crush all our skulls (how much does a speck weigh?). Somewhere between that and "one speck, one blink, ever" is a special penny.

If we can just stipulate "the smallest unit of suffering (or negative qualia, or your favorite term)," then we can move on to the more interesting parts of the discussion.

I also see a qualitative difference if there can be secondary effects or summation causes secondary effects. As noted above, if 3^^^3/10^20 people die due to freakishly unlikely accidents caused by blinking, the choice becomes trivial. Similarly, +0.000001C sums somewhat differently than specks. 1 speck/day/person for 3^^^3 days is still not an existential risk; 3^^^3 specks at once will kill everyone.

(I still say Kyle wins.)

The first thing I thought when I read this question was that the dust specks were obviously preferable. Then I remembered that my intuition likes to round 3^^^3 down to something around twenty. Obviously, the dust specks are preferable to the torture for any number at all that I have any sort of intuitive grasp over.

But I found an argument that pretty much convinced me that the torture was the correct answer.

Suppose that instead of making this choice once, you will be faced with the same choice 10^17 times over the next fifty years. (This number was chosen so that it works out to more than a million choices per second.) If you have a problem imagining the ability to make more than a million choices per second, imagine that you have a dial in front of you which goes from zero to 10^17. If you set the dial to n, then 10^17 − n people will get tortured starting now for the next fifty years, and n dust specks will fly into the eyes of each of 3^^^3 people during the next fifty years.

The dial starts at zero. For each unit that you turn the dial up, you are saving one person from being tortured by putting a dust speck in the eyes of each of the 3^^^3 people, the exact choice presented.

So, if you thought the correct answer was the dust specks, you'd turn the dial from zero to one right? And then you'd turn it from one to two, right?

But, if you turned the dial all the way up to 10^17, you'd effectively be rubbing the corneas of the 3^^^3 people with sandpaper for fifty years (of course, their corneas would wear through, and their eyes would come apart under that sort of abrasion; it would probably take less than a million dust specks per second to do that, but let's be conservative and make them smaller dust specks). Even if you don't count the pain involved, they'd be blind forever. How many people would you blind in order to save one person from being tortured for fifty years? You probably wouldn't blind everyone on Earth to save that one person from being tortured, and yet there are (3^^^3)/(10^17) >> 7×10^9 people being blinded for each person you have saved from torture.


So if your answer was the dust specks, you'd either end up turning the knob all the way up to 10^17, or you'd have to stop somewhere, because there's no escaping that in this scenario there's a real dial in front of you, and you have to turn it to some n between 0 and 10^17.


If you left the dial on, say, 10^10, I'd ask: "Tell me, what is so special about the difference between hitting someone with 10^10 dust specks versus 10^10 + 1 that wasn't special about the difference between hitting them with zero versus one?" If anything, the more dust specks there are, the less of a difference one more would make.

There are easily 10^17 continuous gradations between no inconvenience and having one's eyes turned to pulp, and I don't really see what would make any of them terribly different from each other. Yet n = 0 is obviously preferable to n = 10^17, and so each individual increment of n must be bad.
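
For anyone who wants to check the dial's arithmetic, a quick sketch using the figures from the comment above:

```python
SECONDS = 50 * 3.156e7              # ~1.58e9 seconds in fifty years

rate = 1e17 / SECONDS               # dial increments per second
print(f"{rate:.3g} choices/second") # ~6.3e7, well over a million

# The same rate applies at the top of the dial: each person catches
# 10^17 specks over fifty years, i.e. ~63 million specks per second,
# comfortably past the sandpaper regime the comment describes.
```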

This has nothing to do with the original question. You rephrased it so that it now asks whether you'd rather torture one person or 3^^^3 people. Of course you'd rather torture one person than 3^^^3. That does not equal choosing between torturing one person and 3^^^3 people getting dust specks in their eyes for a fraction of a second.

Kyle wins.

Absent using this to guarantee the nigh-endless survival of the species, my math suggests that 3^^^3 beats anything. The problem is that the speck rounds down to 0 for me.

There is some minimum threshold below which it just does not count, like saying, "What if we exposed 3^^^3 people to radiation equivalent to standing in front of a microwave for 10 seconds? Would that be worse than nuking a few cities?" I suppose there must be someone in 3^^^3 who is marginally close enough to cancer for that to matter, but no, that rounds down to 0. For the speck, I am going to blink in the next few seconds anyway.

That in no way addresses the intent of the question, since we can just increase it to the minimum that does not round down. Being poked with a blunt stick? Still hard, since I think every human being would take one stick over some poor soul being tortured. Do I really get to be the moral agent for 3^^^3 people?

As others have said, our moral intuitions do not work with 3^^^3.

If even one in a hundred billion of the people is driving and has an accident because of the dust speck and gets killed, that's a tremendous number of deaths. If one in a hundred quadrillion of them survives the accident but is mangled and spends the next 50 years in pain, that's also a tremendous amount of torture.

If one in a hundred decillion of them is working in a nuclear power plant and the dust speck makes him have a nuclear accident....

We just aren't designed to think in terms of 3^^^3. It's too big. We don't habitually think much about one-in-a-million chances, much less one in a hundred decillion. But a hundred decillion is a very small number compared to 3^^^3.

That is an interesting argument (I've considered it before) though I think it misses the point of the thought experiment. As I understand it, it's not about any of the possible consequences of the dust specks, but about specks as (very minor) intrinsically bad things themselves. It's about whether you're willing to measure the unpleasantness of getting a dust speck in your eye on the same scale as the unpleasantness of being tortured, as (vastly) different in degree rather than fundamentally different in kind.

Yes the answer is obvious. The answer is that this question obviously does not yet have meaning. It's like an ink blot. Any meaning a person might think it has is completely inside his own mind. Is the inkblot a bunny? Is the inkblot a Grateful Dead concert? The right answer is not merely unknown, because there is no possible right answer.

A serious person -- one who takes moral dilemmas seriously, anyway -- must learn more before proceeding.

The question is an inkblot because too many crucial variables have been left unspecified. For instance, in order for this to be an interesting moral dilemma I need to know that it is a situation that is physically possible, or else analogous to something that is possible. Otherwise, I can't know what other laws of physics or logic apply or don't apply, and therefore can't make an assessment.

I need to know what my position is in this universe. I need to know why this power has been invested in me. I need to know the nature of the torture and who the person is who will be tortured. I need to consider such factors as what the torture may mean to other people who are aware of it (such as the people doing the torturing). I need to know something about the costs and benefits involved. Will the person being tortured know they are being tortured? Or can it be arranged that they are born into the torture and consider it a normal part of their life? Will the person being tortured have volunteered to be tortured? Will the dust motes have peppered the eyes of all those people anyway? Will the torture have happened anyway? Will choosing torture save other people from being tortured?

It would seem that torture is bad. On the other hand, just being alive is a form of torture. Each of us has a Sword of Damocles hanging over us. It's called mortality. Some people consider it torture when I keep telling them they haven't finished asking their question...

Anon, I deliberately didn't say what I thought, because I guessed that other people would think a different answer was "obvious". I didn't want to prejudice the responses.

I wonder if some people's aversion to "just answering the question" as Eliezer notes in the comments many times has to do with the perceived cost of signalling agreement with the premises.

It's straightforward to me that answering should take the question at face value; it's a thought experiment, you're not being asked to commit to a course of action. And going by the question as asked the answer for any utilitarian is "torture", since even a very small increment of suffering multiplied by a large enough number of people (or an infinite number) will outweigh a great amount of suffering by one person.

Signalling that would be highly problematic for some people because of what might be read into our answer -- does Eliezer expect that signalling assent here means signalling assent to other, as-yet-unknown conclusions he's made about (whatever issue where that bears some resemblance)? Does Eliezer intend to codify the terms of this premise into the basis for a decision theory underlying the cognitive architecture of a putative Friendly AI? Does Eliezer think that the real world, in short, maps to his gedankenexperiment sufficiently well that the terms of this scenario can meaningfully stand in for decisions made in that domain by real actors (human or otherwise)?

For my own part I'd be very, very hesitant to signal any of that. Hence I find it difficult to answer the question as asked. It's analogous to my discomfort with the Ticking Time Bomb scenario -- by a straight reading of the premise you should trade a finite chance of finding and disabling the bomb, thereby saving a million lives, for the act of torturing the person who planted it. The logic is internally-consistent, but it doesn't map to any real-world situation I can plausibly imagine (where torture is not terribly effective in soliciting confessions, and the scenario of a "ticking time bomb with a single suspect unwilling to talk mere minutes beforehand" has AFAIK never happened as presented, and would be extremely difficult to set up).

I recognize the internal consistency, yet I'm troubled by my uncertainty about what the author thinks I'm signing up for when I reply.

I'd take it.

I find your choice/intuition completely baffling, and I would guess that far fewer than 1% of people would agree with you on this, for whatever that's worth (surely it's worth something). I am a consequentialist and have studied consequentialist philosophy extensively (I would not call myself an expert), and you seem to be clinging to a very crude form of utilitarianism that has been abandoned by pretty much every utilitarian philosopher (not to mention those who reject utilitarianism!). In fact, your argument reads like a reductio ad absurdum of the point you are trying to make. To wit: if we think of things in equivalent, additive utility units, you get the result that torture is preferable. But that is absurd, and I think almost everyone would be able to appreciate the absurdity when faced with the 3^^^3 lives scenario. Even if you gave everyone a one-week lecture on scope insensitivity.

So... I don't think I want you to be one of the people to initially program AI that might influence my life...

Eliezer, are you suggesting that declining to make up one's mind in the face of a question that (1) we have excellent reason to mistrust our judgement about and (2) we have no actual need to have an answer to is somehow disreputable?

Yes, I am.

Regarding (1), we pretty much always have excellent reason to mistrust our judgments, and then we have to choose anyway; inaction is also a choice. The null plan is a plan. As Russell and Norvig put it, refusing to act is like refusing to allow time to pass.

Regarding (2), whenever a tester finds a user input that crashes your program, it is always bad - it reveals a flaw in the code - even if it's not a user input that would plausibly occur; you're still supposed to fix it. "Would you kill Santa Claus or the Easter Bunny?" is an important question if and only if you have trouble deciding. I'd definitely kill the Easter Bunny, by the way, so I don't think it's an important question.

Followup dilemmas:

For those who would pick SPECKS, would you pay a single penny to avoid the dust specks?

For those who would pick TORTURE, what about Vassar's universes of agonium? Say a googolplex-persons' worth of agonium for a googolplex years.

Wow. People sure are coming up with interesting ways of avoiding the question.

The hardships experienced by a man tortured for 50 years cannot compare to a trivial experience massively shared by a large number of individuals -- even on the scale that Eli describes. There is no accumulation of experiences, and it cannot be conflated into a larger meta dust-in-the-eye experience; it has to be analyzed as a series of discrete experiences.

As for larger social implications, the negative consequence of so many dust specked eyes would be negligible.

Robin, could you explain your reasoning? I'm curious.

Humans get barely noticeable "dust speck equivalent" events so often in their lives that the number of people in Eliezer's post is irrelevant; it's simply not going to change their lives, even if it's a gazillion lives, even with a number bigger than Eliezer's (even considering the "butterfly effect", you can't say if the dust speck is going to change them for the better or worse -- but with 50 years of torture, you know it's going to be for the worse).

Subjectively for these people, it's going to be lost in the static and probably won't even be remembered a few seconds after the event. Torture won't be lost in static, and it won't be forgotten (if survived).

The alternative to torture is so mild and inconsequential, even if applied to a mind-boggling number of people, that it's almost like asking: Would you rather torture that guy or not?

Wow. The obvious answer is TORTURE, all else equal, and I'm pretty sure this is obvious to Eliezer too. But even though there are 26 comments here, and many of them probably know in their hearts torture is the right choice, no one but me has said so yet. What does that say about our abilities in moral reasoning?

It seems obvious to me to choose the dust specks, because that choice would mean the human species has to exist for an awfully long time for the total number of people to equal that number, and a minimum amount of annoyance is something they'd be used to anyway.

Does this analysis focus on pure, monotone utility, or does it include the huge ripple effect putting dust specks into so many people's eyes would have? Are these people with normal lives, or created specifically for this one experience?

Michael Vassar:
Well, in the prior comment, I was coming at it as an egoist, as the example demands.
It's totally clear to me that a second of torture isn't a billion billion billion times worse than getting a dust speck in my eye, and that there are only about 1.5 billion seconds in a 50 year period. That leaves about a 10^10 : 1 preference for the torture.
I reject the notion that each (time, utility) event can be calculated in the way you suggest. Successive speck-type experiences for an individual (or 1,000 successive dust specks for 1,000,000 individuals) over the time period we are talking about would easily overtake 50 years of torture. It doesn't make sense to tally (total human disutility of torture (1 person / 50 years in this case)) × (some quantification of the disutility of a time unit of torture) vs. (total human speck disutility) × (some quantification of a unit of speck disutility).
The universe is made up of distinct beings (animals included), not the sum of utilities (which is just a useful construct).
All of this is to say:
If we are to choose for ourselves between these scenarios, I think it is incredibly bizarre to prefer 3^^^3 satisfying lives and one indescribably horrible life to 3^^^3 infinitesimally better lives than the alternative 3^^^3 lives. I think doing so ignores basic human psychology, from whence our preferences arise.