
October 29, 2007

Comments

Does this analysis focus on pure, monotone utility, or does it include the huge ripple effect putting dust specks into so many people's eyes would have? Are these people with normal lives, or created specifically for this one experience?

The answer that's obvious to me is that my mental moral machinery -- both the bit that says "specks of dust in the eye can't outweigh torture, no matter how many there are" *and* the bit that says "however small the badness of a thing, enough repetition of it can make it arbitrarily awful" or "maximize expected sum of utilities" -- wasn't designed for questions with numbers like 3^^^3 in. In view of which, I profoundly mistrust any answer I might happen to find "obvious" to the question itself.

Since there was a post on this blog a few days ago about how what seems obvious to the speaker might not be obvious to the listener, I thought I would point out that it was NOT AT ALL obvious to me which should be preferred: torturing one man for 50 years, or a dust speck for each of 3^^^3 people. Can you please clarify/update what the point of the post was?

The dust speck is described as "barely enough to make you notice", so however many people it happens to, it seems better than even something far less severe than 50 years of horrible torture. There are so many irritating things that a human barely notices in his/her life; what's an extra dust speck?

I think I'd trade the dust specks for even a kick in the groin.

But hey, maybe I'm missing something here...

Anon, I deliberately didn't say what I thought, because I guessed that other people would think a different answer was "obvious". I didn't want to prejudice the responses.

Even when applying the cold, cruel calculus of moral utilitarianism, I think most people acknowledge that egalitarianism in a society has value in itself, and assign it positive utility. Would you rather be born into a country where 9 out of 10 people are destitute (<$1,000/yr) and the last is very wealthy ($100,000/yr)? Or be born into a country where almost all people subsist on a modest amount ($6,000-8,000/yr)?

Any system that allocates benefits (say, wealth) more fairly might be preferable to one that allocates more wealth in a more unequal fashion. And the same goes for negative benefits. The dust specks may result in more total misery, but there is utility in distributing that misery equally.

The dust specks seem like the "obvious" answer to me, but how large the tiny harm must be to cross the line where the unthinkably huge number of them outweighs a single tremendous one isn't something I could easily say, when clearly I don't think simply calculating the total amount of harm caused is the right measure.

It seems obvious to me to choose the dust specks, because that would mean the human species would have to exist for an awfully long time for the total number of people to equal that number, and that minimal amount of annoyance would be something they were used to anyway.

I too see the dust specks as obvious, but for the simpler reason that I reject utilitarian sorts of comparisons like that. Torture is wicked, period. If one must go further, it seems like the suffering from torture is *qualitatively* worse than the suffering from any number of dust specks.

Anon prime: dollars are not utility. Economic egalitarianism is instrumentally desirable. We don't normally favor all types of equality, as Robin frequently points out.

Kyle: cute

Eliezer: My impulse is to choose the torture, even when I imagine very bad kinds of torture and very small annoyances (I think one can go smaller than a dust mote, perhaps something like a letter on the spine of a book, set in a slightly less well-chosen font, that your eye sweeps over). Then, however, I think of how much longer the torture could last and still not outweigh the trivial annoyances if I take the utilitarian perspective, and my mind breaks. Condoning 50 years of torture, or even a day's worth, is pretty much the same as condoning universes of agonium lasting for eons in the face of numbers like these, and I don't think I can condone that for any amount of a trivial benefit.

Personally, I choose C: torture 3^^^3 people for 3^^^3 years. Why? Because I can.

Ahem. My morality is based on maximizing average welfare, while also avoiding extreme individual suffering, rather than cumulative welfare.

So torturing one man for fifty years is not preferable to annoying any number of people.

This is different when the many are also suffering extremely, though - then it may be worthwhile to torture one even more to save the rest.

Trivial annoyances and torture cannot be compared in this quantifiable manner. Torture is not only suffering, but lost opportunity due to imprisonment, permanent mental hardship, activation of pain and suffering processes in the mind, and a myriad of other unconsidered things.

And even if the torture were 'to have flecks of dust dropped in your eyes', you still can't compare a 'torturous amount' applied to one person to a substantial number dropped in the eyes of many people. We aren't talking about CPU cycles here; we are trying to quantify the qualitative.

If you revised the question and stated exactly how the torture would affect the individual and how they would react to it, and the same for each of the 'dust in the eyes' people (what if one goes blind? what of their mental capacity to deal with the hardship? what of the actual level of moisture in their eyes, and consequently the discomfort being felt?), then, maybe then, we could determine which was the worse outcome, and by how much.

There are simply too many assumptions that we have to make in this mortal world to determine the answer to such questions: you might as well ask how many angels dance on the head of a pin. Or you could start more simply and ask: if you were to torture two people in exactly the same way, which one would suffer more, and by how much?

And you notice, I haven't even started to think about the ethical side of the question...

I think this all revolves around one question: Is "disutility of dust speck for N people" = N*"disutility of dust speck for one person"?

This, of course, depends on the properties of one's utility function.

How about this... Consider one person getting, say, ten dust specks per second for an hour vs 10*60*60 = 36,000 people getting a single dust speck each.

This is probably a better way to probe the issue at its core. Which of those situations is preferable? I would probably consider the second. However, I suspect one person getting a billion dust specks in their eye per second for an hour would be preferable to 1000 people getting a million per second for an hour.

Suffering isn't linear in dust specks. Well, actually, I'm not sure subjective states in general can be viewed in a linear way. At least, if there is a potentially valid "linear qualia theory", I'd be surprised.
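
One way to make the linearity question concrete is a toy calculation. The following is only a sketch under made-up numbers (the SPECK constant and the logarithmic "saturating" curve are assumptions for illustration, not anyone's actual utility function); it just shows how linear and saturating aggregation come apart on the billion-specks-per-second example above.

```python
# Toy comparison: linear vs. saturating aggregation of dust-speck disutility.
# SPECK and the log curve are made-up illustration values, not a real model.
import math

SPECK = 1e-3  # assumed disutility of one dust speck (arbitrary units)

def linear_total(specks_each, people):
    # Suffering simply adds across specks and across people.
    return SPECK * specks_each * people

def saturating_total(specks_each, people):
    # Each person's suffering grows sub-linearly in their own speck count,
    # but still adds across distinct people.
    return people * math.log1p(SPECK * specks_each)

hour = 3600
one_person_billion_per_sec = (1e9 * hour, 1)           # one person, 10^9 specks/s for an hour
thousand_people_million_per_sec = (1e6 * hour, 1000)   # 1000 people, 10^6 specks/s each

print(linear_total(*one_person_billion_per_sec),
      linear_total(*thousand_people_million_per_sec))      # equal: 3.6e9 vs 3.6e9
print(saturating_total(*one_person_billion_per_sec),
      saturating_total(*thousand_people_million_per_sec))  # ~22 vs ~15,100: very different
```

Under the linear rule the two scenarios are interchangeable; under any saturating rule they are not, which is roughly all "suffering isn't linear in dust specks" needs to mean.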

But as far as the dust specks vs torture thing in the original question? I think I'd go with dust specks for all.

But that's one person vs buncha people with dustspecks.

Oh, just had a thought. A less extreme yet quite related real world situation/question would be this: What is appropriate punishment for spammers?

Yes, I understand there are a few additional issues here that would make it more analogous to, say, a case where the potential torturee was planning on deliberately causing all those people a DSE (Dust Speck Event).

But still, the spammer issue gives us a more concrete version, involving quantities that don't make our brains explode, so considering that may help work out the principles by which these sorts of questions can be dealt with.

The problem with spammers isn't that they cause a singular dust speck event: it's that they cause multiple dust speck events, repeatedly, to individuals in the population in question. It's also a 'tragedy of the commons' question, since there is more than one spammer.

To respond to your question: What is appropriate punishment for spammers? I am sad to conclude that until Aubrey DeGray manages to conquer human mortality, or the singularity occurs, there is no suitable punishment for spammers.

After either of those, however, I would propose unblocking everyone's toilets and/or triple shifts as a Fry's Electronics floor lackey until the universal heat death, unless you have even >less< interesting suggestions.

If you could take all the pain and discomfort you will ever feel in your life, and compress it into a 12-hour interval, so you really feel ALL of it right then, and then after the 12 hours are up you have no ill effects - would you do it? I certainly would. In fact, I would probably make the trade even if it were 2 or 3 times longer-lasting and of the same intensity. But something doesn't make sense now... am I saying I would gladly double or triple the pain I feel over my whole life?

The upshot is that there are some very nonlinear phenomena involved with calculating amounts of suffering, as Psy-Kosh and others have pointed out. You may indeed move along one coordinate in "suffering-space" by 3^^^3 units, but it isn't just absolute magnitude that's relevant. That is, you cannot recapitulate the "effect" of fifty years of torturing with isolated dust specks. As the responses here make clear, we do not simply map magnitudes in suffering space to moral relevance, but instead we consider the actual locations and contours. (Compare: you decide to go for a 10-mile hike. But your enjoyment of the hike depends more on where you go, than the distance traveled.)

Yes the answer is obvious. The answer is that this question obviously does not yet have meaning. It's like an ink blot. Any meaning a person might think it has is completely inside his own mind. Is the inkblot a bunny? Is the inkblot a Grateful Dead concert? The right answer is not merely unknown, because there is no possible right answer.

A serious person -- one who takes moral dilemmas seriously, anyway -- must learn more before proceeding.

The question is an inkblot because too many crucial variables have been left unspecified. For instance, in order for this to be an interesting moral dilemma I need to know that it is a situation that is physically possible, or else analogous to something that is possible. Otherwise, I can't know what other laws of physics or logic apply or don't apply, and therefore can't make an assessment. I need to know what my position is in this universe. I need to know why this power has been invested in me. I need to know the nature of the torture and who the person is who will be tortured. I need to consider such factors as what the torture may mean to other people who are aware of it (such as the people doing the torture). I need to know something about the costs and benefits involved. Will the person being tortured *know* they are being tortured? Or can it be arranged that they are born into the torture and consider it a normal part of their life? Will the person being tortured have *volunteered* to have been tortured? Will the dust motes have peppered the eyes of all those people anyway? Will the torture have happened anyway? Will choosing torture save other people from being tortured?

It would seem that torture is bad. On the other hand, just being alive is a form of torture. Each of us has a Sword of Damocles hanging over us. It's called mortality. Some people consider it torture when I keep telling them they haven't finished asking their question...

The non-linear nature of 'qualia' and the difficulty of assigning a utility function to such things as 'minor annoyance' has been noted before. It seems to some insoluble.
One solution presented by Dennett in 'Consciousness Explained' is to suggest that there is no such thing as qualia or subjective experience. There are only objective facts. As Searle calls it, 'consciousness denied'.
With this approach it would (at least theoretically) be possible to objectively determine the answer to this question, based on something like the number of ergs needed to fire the neurons that would represent the outcomes of the two different choices. The idea of which would be the more or less pleasant experience is therefore not relevant, as there is no subjective experience to be had in the first place.
Of course I'm being sloppy here: the word 'choice' would have to be re-defined to reflect that each action is determined by the physical configuration of the brain, and that the chooser is in fact a fictional construct of that physical configuration.
Otherwise, I admit that 3^^^3 people is not something I can easily contemplate, and that clouds my ability to think of an answer to this question.

Uh... If there's no such thing as qualia, there's no such thing as actual suffering, unless I misunderstand your description of Dennett's views.

But if my understanding is correct, and those views were correct, then wouldn't the answer be "nobody actually exists to care one way or another?" (Or am I sorely mistaken in interpreting that view?)

Regarding your example of income disparity: I might rather be born into a system with very unequal incomes, if, as in America (in my personal and biased opinion), there is a reasonable chance of upping my income through persistence and pluck. I mean hey, that guy with all that money has to spend it somewhere-- perhaps he'll shop at my superstore!

But wait, what does wealth mean? In the case where everyone has the same income, where are they spending their money? Are they all buying the same things? Is this a totalitarian state? An economy without disparity is pretty disturbing to contemplate, because it means no one is making an effort to do better than other people, or else no one *can* do better. Money is not being concentrated or funnelled anywhere. Sounds like a pretty moribund economy.

If it's a situation where everyone always gets what they want and need, then wealth will have lost its conventional meaning, and no one will care whether one person is rich and another one isn't. What they will care about is the success of their God, their sports teams, and their children.

I guess what I'm saying is that there may be no interesting way to simplify interesting moral dilemmas without destroying the dilemma or rendering it irrelevant to natural dilemmas.

If even one in a hundred billion of the people is driving and has an accident because of the dust speck and gets killed, that's a tremendous number of deaths. If one in a hundred quadrillion of them survives the accident but is mangled and spends the next 50 years in pain, that's also a tremendous amount of torture.

If one in a hundred decillion of them is working in a nuclear power plant and the dust speck makes him have a nuclear accident....

We just aren't designed to think in terms of 3^^^3. It's too big. We don't habitually think much about one-in-a-million chances, much less one in a hundred decillion. But a hundred decillion is a very small number compared to 3^^^3.
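
To put rough numbers on that (purely as an illustration: 3^^^3 cannot be written down, so a hypothetical stand-in population is used below, and the real number is incomprehensibly larger):

```python
# Expected counts of rare knock-on effects across an enormous population.
# STAND_IN_POPULATION is a placeholder; 3^^^3 dwarfs it beyond comparison.
STAND_IN_POPULATION = 10**40

side_effect_rates = {
    "fatal car accident":        1e-11,  # "one in a hundred billion"
    "mangled, 50 years of pain": 1e-17,  # "one in a hundred quadrillion"
    "nuclear-plant accident":    1e-35,  # "one in a hundred decillion" (short scale)
}

for outcome, p in side_effect_rates.items():
    expected = STAND_IN_POPULATION * p  # expected number of occurrences = N * p
    print(f"{outcome}: ~{expected:.0e} expected occurrences")
```

Even at the most far-fetched rate the expected count is huge, and every factor of ten you strip off the rate is nothing next to 3^^^3.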

Douglas and Psy-Kosh: Dennett explicitly says that in denying that there are such things as qualia he is not denying the existence of conscious experience. Of course, Douglas may think that Dennett is lying or doesn't understand his own position as well as Douglas does.

James Bach and J Thomas: I think Eliezer is asking us to assume that there are no knock-on effects in either the torture or the dust-speck scenario, and the usual assumption in these "which economy would you rather have?" questions is that the numbers provided represent the situation *after* all parties concerned have exerted whatever effort they can. (So, e.g., if almost everyone is described as destitute, then it must be a society in which escaping destitution by hard work is very difficult.) Of course I agree with both of you that there's danger in this sort of simplification.

J Thomas: You're neglecting that there might be some positive side-effects for a small fraction of the people affected by the dust specks; in fact, there is some precedent for this. The resulting average effect is hard to estimate, but (considering that dust specks seem mostly to add entropy to the thought processes of the affected persons) it would likely still be negative.


Copying g's assumption that higher-order effects should be neglected, I'd take the torture. For each of the 3^^^3 persons, the choice looks as follows:

1.) A 1/(3^^^3) chance of being tortured for 50 years.
2.) Getting a dust speck with probability 1.

I'd definitely prefer the former. That probability is so close to zero that it more than makes up for the difference in disutility.
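
Numerically, that per-person comparison looks something like the sketch below. The disutility figures are invented for illustration, and since 1/(3^^^3) underflows any floating-point type, a stand-in probability is used that still wildly overstates the real chance:

```python
# Per-person expected disutility: certain speck vs. a vanishing chance of torture.
# Both disutility constants are assumptions; the probability is a stand-in,
# enormously larger than the true 1/(3^^^3).
TORTURE_DISUTILITY = 1e12   # assumed: 50 years of torture, in speck-units
SPECK_DISUTILITY = 1.0      # one dust speck, defining the unit

p_torture_stand_in = 1e-300  # a tiny but representable double; still >> 1/(3^^^3)

expected_torture_option = p_torture_stand_in * TORTURE_DISUTILITY  # ~1e-288
expected_speck_option = 1.0 * SPECK_DISUTILITY                     # exactly 1

print(expected_torture_option < expected_speck_option)  # True, by ~288 orders of magnitude
```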

Hmm, tricky one.

Do I get to pick the person who has to be tortured?

As I read this I knew my answer would be the dust specks. Since then I have been mentally evaluating various methods for deciding on the ethics of the situation and have chosen the one that makes me feel better about the answer I instinctively chose.

I can tell you this though. I reckon I personally would choose max five minutes of torture to stop the dust specks event happening. So if the person threatened with 50yrs of torture was me, I'd choose the dust specks.

What if it were a repeatable choice?

Suppose you choose dust specks, say, 1,000,000,000 times. That's a considerable amount of torture inflicted on 3^^^3 people. I suspect that you could find the number of repetitions equivalent to torturing each of those 3^^^3 people for 50 years, and that number would be smaller than 3^^^3. In other words, choose the dust specks enough times, and more people would effectively be tortured for longer than if you chose the 50-year torture an equivalent number of times.

If that math is correct, I'd have to go with the torture, not the dust specks.
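
That aggregation argument can be sketched with toy numbers, under the same contested assumption that speck-disutility simply adds across repetitions; both constants below are hypothetical:

```python
# Repeated choices under additive disutility (a sketch with made-up constants).
TORTURE = 1e12   # assumed disutility of 50 years of torture, in speck-units
SPECK = 1.0      # disutility of one dust speck

# Number of repeated speck-choices after which each of the 3^^^3 people has
# accumulated speck-disutility equal to one 50-year torture:
k = TORTURE / SPECK   # 1e12 -- unimaginably smaller than 3^^^3

# Compare k repeats of each policy, if disutility is additive:
#   k speck-choices   -> all 3^^^3 people at torture-level disutility
#   k torture-choices -> only k (= 1e12) people tortured
print(f"k = {k:.0e} choices flips the comparison if specks add up linearly")
```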

Kyle wins.

Absent using this to guarantee the nigh-endless survival of the species, my math suggests that 3^^^3 beats anything. The problem is that the speck rounds down to 0 for me.

There is some minimum threshold below which it just does not count, like saying, "What if we exposed 3^^^3 people to radiation equivalent to standing in front of a microwave for 10 seconds? Would that be worse than nuking a few cities?" I suppose there must be someone in 3^^^3 who is marginally close enough to cancer for that to matter, but no, that rounds down to 0. For the speck, I am going to blink in the next few seconds anyway.

That in no way addresses the intent of the question, since we can just increase it to the minimum that does not round down. Being poked with a blunt stick? Still hard, since I think every human being would take one stick over some poor soul being tortured. Do I really get to be the moral agent for 3^^^3 people?

As others have said, our moral intuitions do not work with 3^^^3.

Wow. The obvious answer is TORTURE, all else equal, and I'm pretty sure this is obvious to Eliezer too. But even though there are 26 comments here, and many of them probably know in their hearts torture is the right choice, no one but me has said so yet. What does that say about our abilities in moral reasoning?

Given that human brains are known not to be able to intuitively process even moderately large numbers, I'd say the question can't meaningfully be asked - our ethical modules simply can't process it. 3^^^3 is too large - WAY too large.

I'm unconvinced that the number is too large for us to think clearly. Though it takes some machinery, humans reason about infinite quantities all the time and arrive at meaningful conclusions.

My intuitions strongly favor the dust speck scenario. Even if we forget 3^^^3 and just say that an infinite number of people will experience the speck, I'd still favor it over the torture.

Robin is absolutely wrong, because different instances of human suffering cannot be added together in any meaningful way. The cumulative effect when placed on one person is far greater than the sum of many tiny nuisances experienced by many. Whereas small irritants such as a dust mote do not cause "suffering" in any standard sense of the word, the sum total of those motes concentrated at one time and placed into one person's eye could cause serious injury or even blindness. Dispersing the dust (either over time or across many people) mitigates the effect. If the dispersion is sufficient, there is actually no suffering at all. To extend the example, you could divide the dust mote into even smaller particles, until each individual would not even be aware of the impact.

So the question becomes, would you rather live in a world with little or no suffering (caused by this particular event) or a world where one person suffers badly, and those around him or her sit idly by, even though they reap very little or no benefit from the situation?

The notion of shifting human suffering onto one unlucky individual so that the rest of society can avoid minor inconveniences is morally reprehensible. That (I hope) is why no one has stood up and shouted "yea" for torture.

The obvious answer is TORTURE, all else equal, and I'm pretty sure this is obvious to Eliezer too.

That is the straightforward utilitarian answer, without any question. However, it is not the common intuition, and even if Eliezer agrees with you he is evidently aware that the common intuition disagrees, because otherwise he would not bother blogging it. It's the contradiction between intuition and philosophical conclusion that makes it an interesting topic.

Robin's answer hinges on "all else being equal." That condition can tie up a lot of loose ends; it smooths over plenty of rough patches. But those ends unravel pretty quickly once you start to consider all the ways in which everything else is inherently unequal.
I happen to think the dust speck is a 0 on the disutility meter, myself, and 3^^^3*0 disutilities = 0 disutility.

I believe that ideally speaking the best choice is the torture, but pragmatically, I think the dust speck answer can make more sense. Of course it is more intuitive morally, but I would go as far as saying that the utility can be higher for the dust specks situation (and thus our intuition is right). How? The problem is in this sentence: "If neither event is going to happen to you personally." The truth is that in the real world, we can't rely on this statement. Even if it is promised to us or made into a law, this type of statement often won't hold up very long. Precedents have to be taken into account when we make a decision based on utility. If we let someone be tortured now, we are building a precedent, a tradition of letting people be tortured. This has a very low utility for the people living in the affected society. This is well summarized in the saying "What goes around comes around".

If you take the strict idealistic situation described, the torture is the best choice. But if you instead deem the situation completely unrealistic and pick a similar one by simply not giving 100% reliability to the sentence "If neither event is going to happen to you personally," then the best choice can become the dust specks, depending on how likely you think it is that a tradition of torture will become established. (And IMO traditions of torture and violence are the kind of thing that spreads easily, as they stimulate resentment and hatred in the groups that are more affected.) The torture situation has much risk of getting worse, but not the dust speck situation.

The scenario might have been different if torture was replaced by a kind of suffering that is not induced by humans. Say... an incredibly painful and long (but not contagious) illness.

Is it better to have the dust specks everywhere all the time or to have the existence of this illness once in history?

Torture. See Norcross: http://www.ruf.rice.edu/~norcross/ComparingHarms.pdf

Robin, could you explain your reasoning. I'm curious.

Humans get barely noticeable "dust speck equivalent" events so often in their lives that the number of people in Eliezer's post is irrelevant; it's simply not going to change their lives, even if it's a gazillion lives, even with a number bigger than Eliezer's (even considering the "butterfly effect", you can't say if the dust speck is going to change them for the better or worse -- but with 50 years of torture, you know it's going to be for the worse).

Subjectively for these people, it's going to be lost in the static and probably won't even be remembered a few seconds after the event. Torture won't be lost in static, and it won't be forgotten (if survived).

The alternative to torture is so mild and inconsequential, even if applied to a mind-boggling number of people, that it's almost like asking: Would you rather torture that guy or not?

@Robin,

"But even though there are 26 comments here, and many of them probably know in their hearts torture is the right choice, no one but me has said so yet."

I thought that Sebastian Hagen and I had said it. Or do you think we gave weasel answers? Mine was only contingent on my math being correct, and I thought his was similarly clear.

Perhaps I was unclear in a different way. By asking if the choice was repeatable, I didn't mean to dodge the question; I meant to make it more vivid. Moral questions are asked in a situation where many people are making moral choices all the time. If dust-speck displeasure is additive, then we should evaluate our choices based on their potential aggregate effects.

Essentially, it's a same-ratio problem, like showing that 6:4::9:6, because 6x3=9x2 and 4x3=6x2. If the aggregate of dust-specking can ever be greater than the equivalent aggregate of torturing, then it is always greater.

Hmm, thinking some more about this, I can see another angle (not the suffering angle, but the "being prudent about unintended consequences" angle):

If you had the choice between very very slightly changing the life of a huge number of people or changing a lot the life of only one person, the prudent choice might be to change the life of only one person (as horrible as that change might be).

Still, with the dust speck we can't really know if the net final outcome will be negative or positive. It might distract people who are about to have genius ideas, but it might also change chains of events that would lead to bad things. Averaged over so many people, it's probably going to stay very close to neutral, positive or negative. The torture of one person might also look very close to neutral if averaged with the other 3^^^3 people, but we *know* that it's going to be negative. Hmm..

Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?

The number of milliseconds in 50 years is about 1.6 x 10^12; its square is about 2.5 x 10^24.

Would you rather one person be tortured for a millisecond (then no ill effects), or that 3^^^3/10^24 people get a dust speck per second for 50 centuries?

OK, so the utility/effect doesn't scale when you change the times. But even if each 1% of added dust/torture time made things ten times worse, once you reduce the dust-speckled population to reflect that, it's still countless universes' worth of people.
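
For reference, a quick check of the time arithmetic in this comment (calendar details approximated; the exact constants don't matter to the argument):

```python
# Rough time arithmetic: milliseconds in 50 years and its square.
MS_PER_YEAR = 365.25 * 24 * 3600 * 1000       # ~3.16e10 ms per year
ms_in_50_years = 50 * MS_PER_YEAR             # ~1.58e12 ms
print(f"{ms_in_50_years:.2e}")                # 1.58e+12
print(f"{ms_in_50_years ** 2:.2e}")           # 2.49e+24 (its square)

# Specks per person at one per second for 50 centuries (5000 years):
specks_per_person = 5000 * 365.25 * 24 * 3600
print(f"{specks_per_person:.2e}")             # 1.58e+11
```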

I'm with Tomhs. The question has less value as a moral dilemma than as an opportunity to recognize how we think when we "know" the answer. I intentionally did not read the comments last night so I could examine my own thought process, and tried very hard to hold an open mind (my instinct was dust). It's been a useful and interesting experience. Much better than the brain teasers, which I can generally get because I'm on heightened alert when reading El's posts. Here, being on alert simply allowed me to try to avoid immediately giving in to my bias.

Averaging utility works only when the law of large numbers starts to play a role. It's a good general policy, as stuff subject to it happens all the time, often enough to give sensible results over a human/civilization lifespan. So, if Eliezer's experiment is a singular event and similar events don't happen frequently enough, the answer is 3^^^3 specks. Otherwise, torture (as in that case, similar frequent-enough choices would lead to a tempest of specks in everyone's eye, which is about 3^^^3 times worse than 50 years of torture, for each and every one of them).

Benquo, your first answer seems equivocal, and so did Sebastian's on a first reading, but now I see that it was not.

Torture.

Consider three possibilities:

(a) A dust speck hits you with probability one,
(b) You face an additional probability 1/(3^^^3) of being tortured for 50 years,
(c) You must blink your eyes for a fraction of a second, just long enough to prevent a dust speck from hitting you in the eye.

Most people would pick (c) over (a). Yet 1/(3^^^3) is such a small number that by blinking your eyes one more time than you normally would, you increase your chance of being captured by a sadist and tortured for 50 years by more than 1/(3^^^3). Thus, (b) must be better than (c). Consequently, most people should prefer (b) to (a).

There isn't any right answer. Answers to what is good or bad are a matter of taste, to borrow from Nietzsche.

To me the example has a messianic quality: one person suffers immensely to save others from suffering. Does the sense that there is a 'right' answer come from a Judeo-Christian sense of what is appropriate? Is this a sort of bias, in line with biases towards expecting facts to conform to a story?

Also, this example suggests to me that the value pluralism of Cowen makes much more sense than some reductive approach that seeks to create one objective measure of good and bad. One person might seek to reduce instances of illness, another to maximize reported happiness, another to maximize a personal sense of beauty. IMO, there isn't a judge who will decide who is right and who is wrong, and the decisive factor is who can marshal the power to bring about his will, as unsavory as that might be (unless your side is winning).

Why is this a serious question? Given the physical unreality of the situation (the putative existence of 3^^^3 humans and the ability to actually create the option in the physical universe), why is this question taken seriously, while something like "is it better to kill Santa Claus or the Easter Bunny?" is considered silly?

Fascinating, and scary, the extent to which we adhere to established models of moral reasoning despite the obvious inconsistencies. Someone here pointed out that the problem wasn't sufficiently defined, but then proceeded to offer examples of objective factors that would appear necessary to the evaluation of a consequentialist solution. Robin seized upon the "obvious" answer that any significant amount of discomfort, over such a vast population, would easily dominate, with any conceivable scaling factor, the utilitarian value of the torture of a single individual. But I think he took the problem statement too literally; the discomfort of the dust mote was intended to be vanishingly small, over a vast population, thus keeping the problem interesting rather than "obvious."

But most interesting to me is that no one pointed out that, fundamentally, the assessed goodness of any act is a function of the values (effective, but not necessarily explicit) of the assessor. And assessed morality is a function of group agreement on the "goodness" of an act, promoting the increasingly coherent values of the group over increasing scope of expected consequences.

Now the values of any agent will necessarily be rooted in an evolutionary branch of reality, and this is the basis for increasing agreement as we move toward the common root, but this evolving agreement in principle on the *direction* of increasing morality should never be considered to point to any particular *destination* of goodness or morality in any objective sense, for that way lies the "repugnant conclusion" and other paradoxes of utilitarianism.

Obvious? Not at all, for while we can increasingly converge on principles promoting "what works" to promote our increasingly coherent values over increasing scope, our expression of those values will increasingly diverge.

The hardships experienced by a man tortured for 50 years cannot compare to a trivial experience massively shared by a large number of individuals, even on the scale that Eli describes. There is no accumulation of experiences, and it cannot be conflated into a larger meta dust-in-the-eye experience; it has to be analyzed as a series of discrete experiences.

As for larger social implications, the negative consequence of so many dust specked eyes would be negligible.

Wow. People sure are coming up with interesting ways of avoiding the question.

Eliezer wrote "Wow. People sure are coming up with interesting ways of avoiding the question."

I posted earlier on what I consider the more interesting question of how to frame the problem in order to best approach a solution.

If I were to simply provide my "answer" to the problem, with the assumption that the dust in the eyes is likewise limited to 50 years, then I would argue that the dust is to be preferred to the torture, not on a utilitarian basis of relative weights of the consequences as specified, but on the bigger-picture view that my preferred future is one in which torture is abhorrent in principle (noting that this entails significant indirect consequences not specified in the problem statement.)

Eliezer, are you suggesting that declining to make up one's mind in the face of a question that (1) we have excellent reason to mistrust our judgement about and (2) we have no actual need to have an answer to is somehow disreputable?

As for your link to the "motivated stopping" article, I don't quite see why declining to decide on this is any more "stopping" than choosing a definite one of the options. Or are you suggesting that it's an instance of motivated continuation? Perhaps it is, but (as you said in that article) the problem with excessive "continuation" is that it can waste resources and miss opportunities. I don't see either of those being an issue here, unless you're actually threatening to do one of those two things -- in which case I declare you a Pascal's mugger and take no notice.
