
November 27, 2007

Comments

No, honestly, this only shows that most people are almost unbelievably stupid and uneducated in even the most basic logic. There is nothing else to say, except to call for radical changes in our educational system, along the lines of what Steven Pinker suggested not long ago.

A 7% probability versus 10% probability may be bad news, but it's more than made up for by the increased number of red beans. It's a worse probability, yes, but you're still more likely to win, you see.


I don't understand. Do you mean you are more likely to win with 7 red beans rather than one, but also proportionately more likely to lose with 93 non-red beans rather than 9? You wink and suggest there is some great wisdom there. I simply don't even know what the hell you are talking about.

Topo, it's a simple unprobabilistic phase inversion topography manifold calculation, I can hardly see how you could fail to understand it.

Ha, Spock vs. McCoy. I think Kirk's position was that it's the affect heuristic that makes us warm, cuddly, human data processors, even if it can be faulted in some artificial situations.
This ties in with the other thread about how far we look down possible chains of results in deciding on an action. We're wired to look to proximal results with high affect, and I'm all for it.

The three parts of that paper that I found most interesting were:

1. Concentrated affect beats diffuse affect. Everybody knows what "obnoxious" means, but "intelligent" could mean a lot of different things; therefore obnoxious wins and carries a higher weight in the averaging of the descriptions.
"More precise affective impressions reflect more precise meanings and carry more weight in impression formation, judgment, and decision making."

2. The fact that more people chose to accept a gamble when a small loss was involved, because the small size of the loss (5 cents) qualified the size of the gain (9 dollars). (See the expected-value sketch after this list.)
"In commenting on the fact that the carriers of value are changes in wealth or welfare, rather than final states, Kahneman and Tversky observe that “Our perceptual apparatus is attuned to the evaluation of changes or differences rather than to the evaluation of absolute magnitudes” (p. 277)."

3. The conclusion of the Damasio 1990 paper which showed that disruption in brain centers linked to affective states disrupted personality to the point of making people sociopathic.
From that paper:
"An investigation of this theory in patients with frontal damage reveals that their autonomic responses to socially meaningful stimuli are indeed abnormal, suggesting that such stimuli fail to activate somatic states at the most basic level."

I don't understand the meaning of "somatic" in this context, can anyone help me out?

My understanding of the Damasio paper's implication is that affect is central to being able to function socially.

That's a whole lot of insights crammed into 40 pages!
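A quick expected-value check on the gamble in point 2 above, as a minimal Python sketch. The $9 gain and 5-cent loss are from the comment; the 7/36 odds are my assumption about the Slovic bet being discussed:

    from fractions import Fraction

    # Bet A: 7/36 chance to win $9, nothing otherwise.
    ev_a = Fraction(7, 36) * 9
    # Bet B: the same, plus a 29/36 chance to lose 5 cents.
    ev_b = Fraction(7, 36) * 9 - Fraction(29, 36) * Fraction(5, 100)

    print(float(ev_a))  # 1.75
    print(float(ev_b))  # ~1.71 -- slightly worse, yet rated more attractive

The point of the arithmetic: the small loss barely moves the expectation, but it gives the $9 a scale to be measured against.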

BTW, significant data was withheld in the examples given:
a) How many dips do you get at the jellybeans? Do the red ones taste better? What is their market value with the current weak dollar?
b) 10,000 people overall, or 10,000 infected people? Degree of infectiousness of the disease?
But that's what the affect heuristic is for: taking decisions in situations of incomplete data. 150 people is a single bounded set; 98% of x people sounds as though it just might be a replicable set. Go for it.

One of the things I found interesting in Eliezer's chapter on biases from his site was the repeated cautions about always being aware that these biases can affect us as well, even when we're aware of them. I certainly wouldn't trust the judgement of someone who chalks them up to the belief "most people are almost unbelievably stupid."

That chapter was a great read, btw.

All people are unbelievably stupid most of the time. Some people just manage to stop now and then.

"It's a worse probability, yes, but you're still more likely to win, you see. You should meditate upon this thought until you attain enlightenment as to how the rest of the planet thinks about probability."

rest of planet = retards

The first terrifying shock comes when you realize that the rest of the world is just so incredibly stupid.

The second terrifying shock comes when you realize that they're not the only ones.

Or consider the report of Denes-Raj and Epstein (1994): Subjects offered an opportunity to win $1 each time they randomly drew a red jelly bean from a bowl, often preferred to draw from a bowl with more red beans and a smaller proportion of red beans. E.g., 7 in 100 was preferred to 1 in 10.

How many times do I get to draw, and is it with or without replacement? If I get to draw every bean in the bowl, the number of non-red beans doesn't matter. ;)
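A minimal sketch of the bowl arithmetic, assuming one draw at a time and the bowl sizes from the quoted experiment:

    from fractions import Fraction

    # Probability of a red bean on one draw from each bowl.
    p_small = Fraction(1, 10)   # 1 red in 10
    p_large = Fraction(7, 100)  # 7 red in 100
    print(p_small > p_large)    # True: the 10-bean bowl is the better bet

    # Expected reds in n draws is n * p with or without replacement,
    # so extra draws don't rescue the 100-bean bowl (unless, as joked
    # above, you are allowed to empty it entirely).
    n = 5
    print(n * p_small, n * p_large)  # 1/2 vs. 7/20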

"I proudly include myself in the idiot category... no matter how smart you are, you spend much of your day being an idiot." - Scott Adams, wise man

"[I]t's a simple unprobabilistic phase inversion topography manifold calculation..."

Tosh. This ignores the salience of the linear data elicitation projected over dichotomous variables with a fully specified joint distribution.

So now five people have made the same comment, all with the same length (1 to 3 sentences), all with a relatively similar, bland style of expression. Caledonian incidentally also made the same comment. Hmmm...

I wasn't trying to say the rest of the planet is stupid. I'm saying that "probability" is a more difficult concept than it seems. E.g. Mr. Spock predicts a 98% chance of the Enterprise being destroyed, and he does this twenty times and it never happens once. That's the scriptwriter's concept of what the word "probability" means, and it's very closely related to the jellybean problem.

Probability is a "more difficult concept than it seems", you say, but in what sense is it difficult? It does not require a vast and complex formalism to avoid the sort of error we see in the jellybean problem, so clearly it is not an inherently difficult error to avoid. If it is a "difficult concept", then, it's difficult because our brains are fundamentally not wired to deal with it appropriately, which is a failure of the brain, or colloquially a "stupidity".

See also: Straw Vulcan, MillionToOneChance

Spock is half right; the reason the Enterprise isn't destroyed is the MillionToOneChance effect that, in fiction, makes what would otherwise be objectively improbable outcomes more likely because they make for a better story. Spock's just not smart enough to realize that the reason the Enterprise never does get destroyed is that he's a character in a TV show. ;)

On the other hand, maybe he's just afraid of the consequences of breaking the fourth wall...

In fairness to analysts, if you are judging stocks that nobody is familiar with, or, even worse, that nobody except complete morons is familiar with, then the risk-return relationship will break down. In general, judging whether an investment is fairly priced depends on your confidence in the judgement of the informed traders (which may include you, if the investment is familiar). The ordinary economic theory you cite does not apply when the market may become inefficient.

Statistics is actually fun, as the notion of probability is so non-intuitive. There's a 1 in 6 chance of throwing a deuce. What does that mean in the real world? Well, if I throw the die 6 times, it should come up once? Uh, no... Well, if I throw 100 sequences of 6 throws, I can predict the number of times the deuce will show up? Uh, no... Well, if I throw 1000 runs of 100 sequences of 6 throws... sorry, you still don't know one damn thing about what the result will be. So what does probability mean? It's great! One of life's rich prizes is to watch someone making a prediction on a particular instance based on statistical reasoning.
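A minimal simulation of that point, assuming a fair die; the run-to-run scatter is exactly what the commenter is getting at:

    import random

    # 'Once per 6 throws' holds only as a long-run average.
    runs = 10_000
    deuces = [sum(random.randint(1, 6) == 2 for _ in range(6)) for _ in range(runs)]
    print(sum(deuces) / runs)  # close to 1.0 averaged over many runs
    print(deuces[:10])         # any single run can show 0, 1, 2, 3... deuces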

I ran across a curious misunderstanding of probability in the SF novel Diamond Mask. In the murder mystery plotline of the book, the protagonist had collected and analyzed data on an (implicitly mutually exclusive and exhaustive) list of eight or nine suspects. The author used probabilities of lower than 20% as a shorthand for not too likely, probabilities of between 20% and 50% as moderately likely, and probabilities above 50% as indicating prime suspects. Unfortunately, there was ~300% total probability in the list. The author could have gotten away with it if she'd just used the word "likelihood" instead of "probability".
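For what it's worth, the one-line repair is to renormalize: if the suspects really are mutually exclusive and exhaustive, divide each figure by the total so they sum to 1. A sketch with invented numbers; only the ~300% total comes from the comment:

    # Hypothetical raw 'probabilities' for eight suspects, totalling ~3.0.
    raw = [0.55, 0.50, 0.45, 0.40, 0.35, 0.30, 0.25, 0.20]
    probs = [r / sum(raw) for r in raw]  # a proper distribution
    print(sum(probs))                    # 1.0 (up to float rounding)
    print(max(probs))                    # the 'prime suspect' drops to ~0.18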

I don't think these people are quite as silly as is made out. Let's look at the mortality rate example. When you give a mortality rate instead of casualty figures, you haven't necessarily communicated what that means for a community, or what it means on a large scale. That information is *implied*, but you haven't handed it to people on a silver platter. A wise person would create that knowledge himself -- he'd realize that if 20% die, and 5k people are infected, that's 1k dead. He'd think of lots of things like that. He'd figure out what it means in a variety of contexts. And he wouldn't pass judgment until he really understood the situation.

What is alleged about people seems to be that they have very bad judgment, or they are irrational. But if my analysis is correct, that need not be the case. We can explain the data simply in terms of widespread ignorance of how to draw consequences out of percentage figures, ignorance of how to create understanding of the implications of a technical fact.

If that's the case, we could approach the problem by thinking about how to communicate more useful information to people, and also how to educate people on how to think well. That is a hopeful and approachable conclusion.

Elliot, I suspect something is missing from your comments. The technocratic knowledge you are describing is multiplication. It sounds like you are calling for greater education in basic arithmetic, or perhaps telling people "and use it." Knowing that 20% of 5,000 is 1,000 is not the mark of an exceptionally wise person; it is the mark of a competent elementary school student. There is perhaps a reason why we can support a game show called "Are You Smarter Than a 5th Grader?"

I do not have immediate access to the Yamagishi article. Were people actually presented with 1,286/10,000 versus 24.14%, or just asked about one (and people tended to react more strongly to absolute numbers than percentages)? The former is really bad. I suppose there is a story to be told about thinking that maybe few people get the 24.14% disease, or that the 98% of 150 measure is applied repeatedly while the 150 measure works just once, or you get many draws without replacement from the bean bowls. I don't know that those are plausible stories.
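Putting the two framings from the Yamagishi item on a common scale takes one division (figures from the comment):

    # The frequency framing, converted to a percentage.
    frequency = 1286 / 10000       # 0.1286, i.e. 12.86%
    percentage = 24.14 / 100       # 0.2414
    print(frequency < percentage)  # True: the absolute-number framing
                                   # describes about half the mortality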

Don't we expect people to react differently to the same numbers in different contexts? Eliezer has already hit Anchoring and Adjustment. Is it a similar bias, innumeracy, or something else that causes people to react differently to "17,520 times per year" versus "twice an hour"?
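And those last two framings really are the same number; the check is straight multiplication:

    # 'Twice an hour' expressed per year.
    print(2 * 24 * 365)  # 17520 -- identical to '17,520 times per year'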

The issue is not multiplication.

Suppose we "put things in perspective" by comparing the figures 1286 and 10000 to quantities people understand better. In my case, we might note my hometown had a bit over 10k people, and the high school had a bit under 1286. That could give me a less abstract understanding of what that kind of casualty rate means. With that understanding, I might be able to make a better judgment about the situation, especially if, like many people, I dislike math and numbers. (Which is perfectly reasonable given how they were subjected to unpleasant math classes for years.)

What about that 24% figure? Well, it contains fewer hints of what to apply it to in order to understand it. We aren't handed numbers we already know how to relate to our experience. It may be harder to get started.

In other words, thinking of a new perspective provides *new knowledge* about the situation that was not contained in the information communicated to the study participants. It was implied, but so were infinitely many other things. There is much skill in knowing what implications to find and follow. So, this contextualizing knowledge must be created, and many people don't know to do so, or do so poorly. The study questions which are more helpful to people in creating this kind of knowledge may *understandably and reasonably* result in people making better judgments, because they present more useful information.


Zubon, knowing when to use multiplication, how to use multiplication, why to use multiplication, and doing so reflexively and without outside prompting, is a bit more technocratic than you might think. Have you ever tried to teach math to someone who is not good at math?

Elliot wrote:
"I don't think these people are quite as silly as is made out. "
"What is alleged about people seems to be that they have very bad judgment, or they are irrational."

Clearly human beings have a brain relatively well suited to their world which is, nevertheless, far from infallible. Hence stock market crashes, wars, and all manner of other phenomena which demonstrate the imperfect judging ability of the human mind. The human mind commits errors. One needn't condemn the human mind, or the average capacity of humanity, in order to point out these errors and speculate as to their causes, as this seems to be a fruitful endeavor for learning more about the function of the mind, which I think the above-linked chapter demonstrates very well.
One need not pass any sort of value judgement relating to decision makers; in fact, it's far better if one doesn't, because that is only distracting and ultimately polemical. We want to measure the precision with which the human mind models reality, and what its sources of error are. So it is in the end completely irrelevant in the context of this discussion if this or that group of decision makers is labelled as "super smart", "idiotic", "irrational", "silly", etc. The point is to investigate the underlying processes.


"So now five people have made the same comment, all with the same length (1 to 3 sentences), all with a relatively similar, bland style of expression."

Great minds think alike. And fools seldom differ.

Eliezer, we could spend a long time commiserating on that one. I used to think the problem was that people never learned algebra properly, but I have begun to wonder how many have a firm grasp on applying second grade math. The hard part seems to be knowing what to divide or multiply by what (teaching Bayes' Theorem is fun for this). Real life is all story problems.

Recent adventures in math include baffling a room with the insight that 12*5/12=5, and explaining how to figure out what percent 300 is of 1,200. Perhaps I should be more worried about the technocratic difficulties of addition; Division of Labour has an occasional series of "The Diff."
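Since Bayes' Theorem is the fun test case, here is a minimal worked example of "what to multiply by what" (all the rates are invented for illustration), followed by the two classroom items from the comment:

    # P(disease | positive test), by Bayes' Theorem, with made-up rates.
    p_disease = 0.01      # base rate
    p_pos_given_d = 0.95  # sensitivity
    p_pos_given_h = 0.05  # false-positive rate
    p_pos = p_pos_given_d * p_disease + p_pos_given_h * (1 - p_disease)
    print(p_pos_given_d * p_disease / p_pos)  # ~0.16, not 0.95

    # The two classroom puzzles:
    print(12 * 5 / 12)       # 5.0
    print(300 / 1200 * 100)  # 25.0 -- 300 is 25% of 1,200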

Eliezer is correct that lots of people are very bad at calculating probabilities, and there are all kinds of well-known biases in such calculations when affect gets involved, especially small-sample biases when one is personally aware of an outlier example, especially a bad one.

However, the opening example is perfectly fine. Eliezer even has it: the higher insurance payment is there to cover the real emotional pain of losing the more personally valued grandfather clock. How much we subjectively value something most certainly depends on the circumstances of how we obtained it. There is nothing irrational about this whatsoever. Rationality above all involves following that old advice: know thyself.

With 7 beans in a hundred, I can just keep drawing beans until I get $7 worth, whereas with 1 in 10, the most I can get is $1. Not only that, I get to eat a hundred free jelly beans. This doesn't seem too mysterious to me.
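The exhaustive-drawing arithmetic, for the record; a sketch assuming $1 per red bean (as in the quoted experiment) and that you may draw until each bowl is empty:

    # Total payout if every bean may be drawn (without replacement).
    print(7 * 1)  # $7 from the 100-bean bowl (7 red beans)
    print(1 * 1)  # $1 from the 10-bean bowl (1 red bean)
    # Per draw, though, the small bowl still pays more: $0.10 vs. $0.07.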

Barkley Rosser,

The monetary payout isn't higher for the more emotionally valuable object -- it's $100 in both cases. If you missed that, that could explain why people paid more for it; they ignored the dollar figure and assumed that the more valuable item was insured for more.

But if you didn't miss that... Are you suggesting that the $100 is more valuable when it coincides with a greater misfortune?

Benquo,

You are right. I misread it. The first case is one of irrationality.

"A 7% probability versus 10% probability may be bad news, but it's more than made up for by the increased number of red beans. It's a worse probability, yes, but you're still more likely to win, you see. You should meditate upon this thought until you attain enlightenment as to how the rest of the planet thinks about probability."

I think this says less about probability and more about people's need to keep an optimistic outlook on life. You emphasize the positive fact that there's an "increased number of red beans", while ignoring the equally true fact that there's also a far greater increase in the number of non-red beans. This tends to support the cliched wisdom that people tend to filter out bad news, and hear only what they want to hear. It's a pretty good reflection of human nature.
