
July 22, 2008

Comments

I don't think promoting truth (or "truth") will serve an aim of a better understanding of the world as much as promoting transparency. There seems to me to be something more naturally subversive to anti-rationality about promoting transparency than promoting "truth".

I'd even go so far as to say that there is a difference between Truth and truth, both of which get applause, and neither actually refers to making your beliefs conform to reality. Minuscule-t truth is more often associated with honesty and genuineness rather than ascertaining the truth. When you believe in truth, it's in telling the truth, or in doing and believing in things in a heartfelt way. Capital-T Truth, on the other hand, is something more similar to faith. It's some vague idea of ultimate reality achieved through philosophy or religion, and is contrasted starkly with "fact" (some may recall the distinction in an early scene in Indiana Jones and the Last Crusade). Science and rationality are quite apart from "Truth", and "fact" never gets applause.

Mysteries are an extremely popular pro-truth art form, but they have the limits of being fiction, and are generally not about finding out anything really surprising or painful. Offhand, I can't think of any mysteries that deal much with the social or psychological consequences of finding out that someone you knew and liked was a murderer.

There's been increasing social pressure to tell the truth about at least some aspects of sex. The subject used to be a lot more blanked out in the public sphere.

There's a lot more truth floating around about war than there used to be, and generally (at least on the left) a lot of respect for investigative journalism. (That one may be biased in favor of some outcomes -- I'm not sure.)

I suggested earlier that a major driver of "moral progress" is the fact that wealthier people engage in "luxury spending" on simplicity, e.g. making their behavior better conform to the norms that come out of their lips and condemning others for not doing the same. This is a good thing when the norms that people speak are better than those they obey, which largely seems to be true historically in the US. One problem that is mentioned here is that old hypocrisies can easily be replaced by new hypocrisies instead. The problem I mentioned elsewhere is that some cultures, notably Islam, historically seem to have praised behavior worse than they engaged in.

I agree that there isn't a general pressure towards truth-telling. Just getting somewhat more truth in some limited but fraught areas has been remarkably difficult.

I think the word you're looking for is 'truthiness': http://en.wikipedia.org/wiki/Truthiness

I suspect all of us humans lie to ourselves. We are wired that way.

The difference between self-declared "rationalists" and "believers" ("I don't care about the evidence, I just want to believe in God") may be just different sources of self-respect. The "believer" may actually respect himself more precisely because his faith is blind ("Blessed are those who have not seen and have believed"). The "rationalists", taking pride in being exactly the opposite, would search for logical arguments to motivate the position that in their heart they have already chosen. This is rarely difficult to do, as rational pro and contra arguments can be made for virtually any point of view.

I think a simpler explanation is just that people are not absolutists about following social norms, so they'll regularly violate a norm if it comes into conflict with another norm or something else. To take one example, there is a clear social norm against lying which children learn (they are told not to lie and chastised when they are caught lying). But people still lie all the time, and not just for personal benefit but also to spare other people's feelings and, perhaps most commonly, to make social interactions go more smoothly. And instead of seeing these cases as violating the norm against lying because something else is even more important here, it seems like liars often don't even feel like they are breaking a norm against lying. Instead, the norm against lying doesn't even get applied to this case.

How do people manage to pull off this flexibility in applying norms? The main trick may be something as simple as: once you've decided on something and have a norm that matches your decision, other norms are irrelevant - there's no need to even consider them. Although that leaves open the important question of how one norm wins in the first place. (Another possibility is that people are using something like modus tollens: lying is wrong, this is not wrong, therefore this doesn't really count as lying.)
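Spelled out schematically (a sketch of the inference as I read it, with L and W as my own shorthand), the modus tollens move runs:

```latex
% Let L stand for "this act is lying" and W for "this act is wrong".
% The norm: lying is wrong.
\[ L \rightarrow W \]
% The speaker has already judged this particular act acceptable:
\[ \neg W \]
% Modus tollens then delivers: this act doesn't really count as lying.
\[ \therefore\; \neg L \]
```

The inference is formally valid; the flexibility enters in how confidently the second premise is asserted before the first is ever consulted.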

Eliezer and many others here are absolutists about the truth norm, but most people see it as on par with other norms, like the norm in favor of being upbeat or optimistic and the norm about people being entitled to their beliefs. And when norm absolutists run into people who are mushy about their favored norm, they may doubt that those people even have the norm.

I would like to see TGGP respond to my proposed theory of moral progress.

I don't think that evolutionary psychology is needed to explain self-deception. Conditioning seems to be sufficient. The first, crudest, simplest approximations that most people learn for the concepts "truth" and "morality" may be "morality is that you can't do what you want" and "truth is that you can't think what you want". If so, both will be seen as external authoritarian constraints to be rebelled against when one has the secrecy, status, or social support to do so. One major problem with the early AI "Eurisko" was that it wire-headed itself. This may be an extremely general problem facing a large space of learning systems, including humans. Evolution uses kludges, so it probably solves this problem with kludge solutions. In that case "truth" really is an external constraint on 'you' (for certain values of 'you' that don't include your whole brain); actually a set of constraints, some of which evolved before the general learning systems and stabilized the latter enough to allow their evolution. These constraints prevent you from simply being happy for your short existence, which may be the correct 'average egoist' thing to do, and which, with good enough external supports such as a baby's parents or Nozick's experience machine, may be the correct 'egoist' thing to do for those who can't contribute to a Friendly Singularity or don't identify with its aftermath.

I don't think that Leonid understands what "rationalists" are. I ask him, where are you used to encountering these exotic creatures?

interesting post eliezer!

i think there probably is a genuine norm for truth-telling in some contexts, and we punish people who don't tell the truth, but not in others.

so we throw someone in jail for perjury but we don't punish someone for lying about liking the dinner they were just served.

there's a value in deception and a value in truth, i suppose, and for our benefit, it makes sense to use both at times, i suspect.

knowing when lying and truth-telling are valuable does seem to require some commitment to looking into what is the truth of a matter.

i'm inclined to replace self-deception with a lethargy to investigate some possible leads to the truth with great energy, presumably because such behavior was self-protective and rewarded by evolutionary processes.

my feeling about a sort of absolute commitment to expressing the truth is that the instinct to be a truth-teller despite social costs does have some value--'this guy tells the truth even when it hurts him. we want the unvarnished truth, we go to him. we should make sure he sticks around to cut through the nonsense.'

there's a danger in everyone being that way in an interdependent group though, it seems to me, because when you're at war with another group, you don't want everyone expressing the battle plans to the enemy, or being unable to deceive and reap its strategic benefits.

michael vassar: "I don't think that Leonid understands what 'rationalists' are. I ask him, where are you used to encountering these exotic creatures?"

From the wiki: "rationalists" are people who believe that "the truth can best be discovered by reason and factual analysis, rather than faith, dogma or religious teaching". I have encountered plenty of these exotic creatures (an occupational hazard). Of course, one needs to remember that subscribing to a doctrine and actually following it in one's personal life are not the same thing.

Eliezer,
Do you think that all public declarations of faith should be met by gasps and frozen shock, or just those which are framed using your particular phrasing? Also, the existence of a social norm implies penalties for flagrant violators. If Jews (for example) persist in encouraging theistical utterances by other Jews, what steps do you think society at large is justified in taking?

There are plenty of examples of "Choosing knowingly" being frowned upon by large segments of society. I can think of at least four:

* In large segments of society, you're not supposed to accept or reject a religious belief system by choosing knowingly. Religion is instead supposed to be a divine revelation.

* Patriotism frowns on anyone who honestly tries to figure out which country is the best (or which side of a war is right).

* Love is considered more virtuous by many if blind; you're not supposed to decide to fall in love with someone based on rationally considering that person's merits. It's supposed to be mysterious and magical. (Similar to the religion example.)

Note: The theme running through these examples is loyalty. We value loyalty, but some see it as somewhat incompatible with choosing knowingly. While loyalty to a goal goes hand in hand with choosing knowingly, loyalty to a belief does not. To the extent that those untrained in the art of rationality tend to mix up "is"s and "should"s, they will feel that loyalty is incompatible with choosing knowingly.

* Last, there are plenty of popular stories about people who succeed because they "believe in themselves" and very few about people who fail to do something despite believing in themselves. I remember getting into an argument with someone who said "if you really believe in yourself, you will win."

I replied "What if there's a competition winnable by only one person, and *two* of the contestants really believe in themselves?"

I think they tried to weasel out of it by suggesting that there might be more than one race held and that the contestants could take turns winning, or something like that.

So parts of our culture seem to value believing in yourself over knowingly acknowledging one's weaknesses and then picking a realistic course of action. (And of course, "acknowledging one's weaknesses" is also an applause light...)

I sell "truth" for a living. Or, perhaps better said, I discover and then sell the "truths" people are eager to buy.

I conjecture that having access to, or good insights about, the truth is and has always been a survival trait of great value - especially among social animals. People have thus indeed a reverence for truth borne of practicality.

The problem is that most people, including most skeptics, are afraid to dip their "truths" in the acid bath of falsification. Far better (for the ego at least) to build elaborate structures of correlations and consistent-withs than to confess that you are unwise and so not fit for the role of shaman or congressman.

Accordingly a wise lawyer, knowing that he can't change the truths to which a particular juror clings, recognizes that his juror holds to many different truths. Therefore he appeals to the held truth most happily amenable to his argument. For example, I once had a case in which I represented a company that, like any which bought ceiling tiles in the 1950s, had asbestos on its premises in the form of acoustic tiles. The company was being sued by a four-decade smoker who claimed that his three weeks of exposure to the ceiling tiles in the 1960s, while installing a new phone system, caused him to develop lung cancer decades later. I knew that we had utopian jurors: the sort who think that all risk can be eliminated from life, and that while they never impose any risk on their fellows, they are beset by evil "corporations" imposing risk on them. So what to do?

The plaintiff had raped a 12-year-old girl and spent 6 years in prison. I found it out during the trial (a good paralegal is priceless) and sprang it on him while he was on the stand, to great effect (he'd lied about prior convictions in his deposition). A felony conviction within 10 years is usually admissible on the question of credibility if the perpetrator takes the stand. Such evidence didn't have anything to do with whether a few fibers of asbestos are more significant in a small cell lung cancer case than 40 years of smoking, of course. But it made all the difference in the world. Those jurors' truth that "rapists are bad" was more important than their truth that "even one asbestos fiber will KILL!"

So it is with most "truths".

And in my experience, among the hierarchy of "truths", the truth about God ranks for most mortals far down the list, or makes no appearance at all, on their decision trees. Thus, most rants about believers or un-believers are as tedious as a discussion about which is better - the donut or the donut hole. And anyway, they're both simply phenomena, clues, about the whole, as it were.

"A felony conviction within 10 years is usually admissible on the question of credibility if the perpetrator takes the stand."

Does this apply only when the perpetrator lied about the felony conviction (which would be directly relevant to credibility), or just in general (on the theory that people with felony convictions are Bad and thus likely to be dishonest)?

