
September 30, 2007

Comments

Sadly, I almost always surprise economics graduate students looking for research topics when I ask them: "What question, whose answer you do not know, would you most like to answer?"

How would this relate to *shock* Bruno Latour's conceptualization of Actor-Network Theory, where the sociologist simply tries to maximise the number of sources of uncertainty in a set of trials, without resorting to an "explanatory social theory"?

I find the linguistic distinction to be better than you suggest - to rationalize something is to start with something that isn't rational. (If it were already rational, it wouldn't need to be rationalized - it's already there.)

That being said, rationalization in action isn't always bad, because we don't always have conscious understanding of the algorithm used to produce our conclusions. This would be like, to use your example, Einstein coming to the conclusion of relativity - and then attempting to understand how he got there. Rationalization in this case is a useful tool, as it is, in effect, an attempt to obtain the variables that originally went into the algorithm, perhaps to examine their validity.

If you already understand how you got to a conclusion which you are then attempting to bolster - if evidence is being filtered and the inconvenient parts ignored - then it is precisely as bad as you say.

It is as if "lying" were called "truthization".

Apologies for the content-free comment, but this is a really great line. Worthy of Stephen Colbert.

Of course, in an etymological sense, "rationalization" doesn't seem so odd. "Reason" means both logic and motivation. Those two concepts are conflated in the word and related words, and "rationalization" is simply formed from "rationale". (Actual etymologists, or users of Google, may feel free to correct me.)

I agree with Adirian. Rationalization is a process of rational-explanation-seeking. It starts from a statement that was obtained by a non-rational process (as when you overheard something, or intuitively guessed something) and then creates a rational explanation according to one's concept of rationality, concurrently adjusting the statement if necessary. So normal rationalization does change the conclusion: it can change its status from 'suspicious statement' to 'belief', or it can adjust it to be consistent with the facts. Biased rationalization, by contrast, uses a 'biased rationality' to build its explanation - for example, the 'clever arguer' applies selection bias.

It starts from a statement that was obtained by a non-rational process (as when you overheard something, or intuitively guessed something)

An intuitive guess is non-scientific but not non-rational.

Random comment:

Many years ago, a series of articles written under the pseudonym Archibald Putt, collectively referred to as "Putt's Laws", appeared in Research/Development magazine. One law is relevant to the topic at hand.

"Decisions are justified by benefits to the organization; they are made by considering benefits to the decisionmakers."

If it is easier to lie convincingly when you believe the lie, then rationalization makes perfect sense. One makes a decision based on selfish, primarily unconscious motives, and then comes up with a semi-convincing rationalization for public consumption. "I stole that because I deserved it" would be a classic example of this kind of justification.

Eliezer: "An intuitive guess is non-scientific but not non-rational."

It doesn't affect my point, but do you argue that intuitive reasoning can be made free of bias?
