
April 30, 2009

Comments

Do we understand our reasons better when we have stronger incentives to do so? Or do we adjust our confidence upward while being just as wrong as these people were about the jam?

Alternative conclusion: People don't pay a whole lot of attention to the jam flavors they select, or more generally to very minor decisions.

Going from "We often pay little attention to minor decisions" to "we don't know the reasons for our choices generally" is a rather large leap.

Yeah, I'm with psychohistorian. I'm not surprised they got these results in simple shopping decisions. Analyzing one's own behaviour during these kinds of choices reveals that we rely on only our most instinctual preferences. It's not yet clear how many kinds of decisions this process extends to, and introducing emotion and/or morality brings further complications.

It is of course a lot more expensive to test how much attention we are paying to the more important choices we make.

How about the 1/3 to 1/2 of people who did spot the switch?

The subjects of this experiment were apparently random (or perhaps self-selected!) shoppers at a supermarket, which means that the experiment was getting about an average distribution of humans according to mental ability; possibly even below average to the extent that they are self-selected.

Could the rationalizations observed in 1/2 to 2/3 of the subjects not be defense mechanisms which average and below-average people evolve in order to hide their mental disadvantage?

I have observed that people whose mental abilities aren't stellar do try to hide this - by grooming and dressing well and otherwise emphasizing a well-kept exterior; by responding to what they don't understand either with "wise" silence or with a platitude that they hope will not widely miss the mark; by evading requests for clarification with vagueness.

These all appear to be mechanisms that people develop in order to hide that they're inept, because showing people how little of the world they understand (relative to a few others who do) might cost them their livelihoods.

It is no wonder, then, that people rationalize - and perhaps even do so consciously and purposefully, trying as they always do to appear confident despite inner confusion.

I contend that Dr. Hanson has no idea why he posted this.

The fact that the participants were asked to verbally report on their choices introduces the potential for a social psychological explanation. The participants may not have wanted to confront the researcher with the fact that their jars were mislabeled. Especially in a situation where the participant has no personal investment in the outcome, they have little motivation to confront a perceived authority figure (the researcher).

Strikingly, people detected no more than a third of all these trick trials.

The article doesn't say how this was ascertained. When I clicked through to the original New Scientist article, it didn't fully load, and what I could read gave no answer.

It makes a difference whether the participants were asked this, or it was assumed from their willingness to defend "their" choice, or they had to volunteer it.

To echo Steven's comment, what happens when you ask them if the second taste was the same as the first, with a $20 reward for a correct answer?

While people are very blind with some choices, they are very particular about others. It seems reasonable to say that incentives are the difference (which does not bode well for tests of voter choice blindness).

There are many decisions to which we give little thought, but there's a spectrum. In my own experience, there are decisions I make without thinking too long about it (like commenting on this post) and there are ones that I have a clear and verbalizable justification for *before* I make them. Those decisions are ones which affect money and will probably have to be defended to others who control my money - i.e., at work. It would be more interesting to do this experiment with cash rewards, where justifying the reasoning coherently after the fact leads to additional reward.

I recently read Gerd Gigerenzer's "Gut Feelings: The Intelligence of the Unconscious" which had a section on this type of experiment. Apparently, being forced to **verbalize** why or how you made a particular choice seems to interfere with your memory of what you chose. The book discussed experiments with several different types of choices for the subjects to make, and the results were pretty similar.

A prior problem is that we are not wise enough to avoid making mistakes we haven't even conceived of. So what is the value of reasoning about our actions, before or after, if we do not understand what they produce?

Sometimes people have trouble distinguishing flavors. This story is making the rounds today:

Can People Distinguish Pâté from Dog Food?

"Considering the similarity of its ingredients, canned dog food could be a suitable and inexpensive substitute for pâté or processed blended meat products such as Spam or liverwurst. However, the social stigma associated with the human consumption of pet food makes an unbiased comparison challenging. To prevent bias, Newman's Own dog food was prepared with a food processor to have the texture and appearance of a liver mousse. In a double-blind test, subjects were presented with five unlabeled blended meat products, one of which was the prepared dog food. After ranking the samples on the basis of taste, subjects were challenged to identify which of the five was dog food. Although 72% of subjects ranked the dog food as the worst of the five samples in terms of taste (Newell and MacFarlane multiple comparison, P<0.05), subjects were not better than random at correctly identifying the dog food."
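The quoted abstract's claim that subjects were "not better than random at correctly identifying the dog food" is a simple binomial question: with five unlabeled samples, a guesser succeeds with probability 1/5. A minimal sketch of the check, using hypothetical numbers since the abstract gives neither the sample size nor the number of correct identifications:

```python
from math import comb

def binom_p_at_least(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance of seeing k or more
    correct identifications if every subject guesses at random."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical numbers for illustration: 18 subjects, 3 correct picks,
# chance level 1/5 (one dog-food sample among five).
p_value = binom_p_at_least(18, 3, 1/5)
print(round(p_value, 3))  # → 0.729
```

A p-value this large means 3 hits out of 18 is entirely consistent with guessing; only a hit rate well above n/5 would support genuine discrimination.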
