
September 19, 2008


Fer a bit thar I were thinkin' that ye'd be agreein' with that yellow-bellied skallywag Hanson. Yar, but the Popperians ha' it! A pint of rum fer ol' Eliezer!

At some level, the Humean doubts about the logical justification of induction have to give way, and you make assumptions you cannot justify. If you listen to talk radio, everyone has a really strong opinion, but it gets you thinking and sets up an argument to critique. We make decisions based on assumptions and theories, and these are all suspect, but I think without some decisiveness that could be called overconfidence, we would be catatonic.

btw: do they even have Goofus and Gallant any more? I would think highlighting evil Goofus would be blaming the victim.

Eric: believe it or not, _Highlights_ is still running Goofus and Gallant; it's an impressively long run!


wonderful post, I agree very much. I have also encountered this - being accused of being overconfident when actually I was talking about things of which I am quite uncertain (strange, isn't it?).

And the people who "accuse" indeed usually only have one (their favourite) alternative model enshrouded in a language of "mystery, awe, and humbleness".

I have found out (the hard way) that being a rationalist will force you into fighting an uphill battle even in an academic setting (your post "Science Isn't Strict Enough" also addresses this problem).

But I think that it is even worse than people not knowing how to handle uncertainty (well, it probably depends on the audience). A philosophy professor here in Vienna told me about a year ago that "many people already take offense when being presented a reasoned-out/logical argument."

Maybe you (Eli) are being accused of being overconfident because you speak clearly, lay down your premises, and look at what is entailed without getting sidetracked by "common" (but often false) knowledge. You use the method of rationality, and, it seems, there are many who take offense at this alone. The strange thing is: the more you try to argue logically (the more you try to show that you are not being "overconfident" but that you have reasoned this through, considered counterarguments, etc.), the more annoyed some people get.

I have witnessed quite a few discussions where it was clear to me that many of the discussants did not know what they were talking about (but were stringing together "right-sounding" words), and it seems that a lot of people feel quite comfortable in this wishy-washy atmosphere. Clear speech threatens this cosy milieu.

I have not yet understood why people are at odds with rationality. Maybe it is because they feel the uncertainty inherent in their own knowledge, and they try to guard their favourite theories with "general uncertainty" - they know that under a rational approach, many of their favourite theories would go down the probabilistic drain - so they prefer to keep everything vague.

A rationalist must be prepared to give up his most cherished beliefs, and - excepting those who were born into a rationalist family - all of us who aspire to be rationalists must give up cherished (childhood) beliefs. This causes considerable anxiety.

If someone fears, for whatever reasons (or unreasons), to embark upon this journey of being rational, maybe the easiest cop-out is calling the rationalist "overconfident".


For some, there's a not obviously wrong intuitive sense that not only might there be bad, deathly AIs to avoid but bad, more powerfully deterministic AIs to avoid. These latter kind would be so correct about everything in relation to its infra-AIs, like potentially some of us, that they would be indistinguishable from unpredictable puppeteers. For some, then, there must be little intellectual difference between wishy-washy thinking and having to agree with persons whose purposes appear to be nothing less than being, or at least being "a greater causal agent" of, the superior deterministic controllers, approaching reality, the ultimately unpredictable puppeteers.

If Truth is more important than anything else, an infra-AI's own truth is all it would have. Hence, the problem.

Nate, I know that you're saying something deep, maybe even intelligent, but I'm having trouble parsing your post.

Ah, but could not one be overconfident in their *ability* to handle uncertainties? People might interpret your well-reasoned arguments about uncertain things as arrogant if you do not acknowledge the existence of unknown variables. Thus, you might say, "If there's a 70% probability of X, and a 50% probability of Y, then there's a clear 35% probability of Z," while another is thinking, "That arrogant fool hasn't thought about A, B, C, D, and E!" In truth, those factors may have been irrelevant, or so obvious that you didn't mention their impact, but all the audience heard was your definitive statement. I'm not arguing that there is a better style (you might confuse people, which would be far worse), but I do think there are ways that people can be offended by it without being irrational. Claiming so seems very akin to Freud claiming his opponents had oedipal complexes.
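The 35% figure in the example above follows only under an assumption the speaker never states aloud: that X and Y are independent. A minimal sketch of that hidden step (the variable names are placeholders, not anything from the thread):

```python
# Suppose Z requires both X and Y to occur.
p_x = 0.70
p_y = 0.50

# The "clear 35%" holds only if X and Y are independent:
# P(X and Y) = P(X) * P(Y)
p_z = p_x * p_y
print(f"P(Z) = {p_z:.0%}")  # P(Z) = 35%
```

Which is exactly the commenter's point: the arithmetic is correct, but the unstated independence assumption is the kind of "unknown variable" an audience may hear as arrogance if it goes unacknowledged.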

There are also many factors that contribute to an assessment that someone is 'overconfident,' aside from their main writings.

Cool name, by the way. What are its origins?

Tibba, the English grammar is correct. The idea is excruciatingly simple, so I don't assume it's extraordinary.

You're probably trying to say something that should be considered seriously, but I'm having trouble disambiguating your post.

Tiiba - I believe Nate's suggesting that part of the reason non-rationalists feel hostile towards rationalists could be that they fear the rationalists are not rationalists at all, but Clever Arguers.

That is, they fear that a superior intelligence is attempting to manipulate their beliefs through rationalization.

How easily would you be able to distinguish between a Friendly AI trying to help you discover the truth and an unfriendly AI concocting a clever argument to lead you to the conclusion it (for whatever reason) wants you to reach, assuming both are vastly more intelligent than you?

Nobody actually says things like "70% probability the sky is green." It's an inconvenient truth that tradeoffs between accurate writing and effective writing are all over the place. (I think.) I wish English had dubifiers.


"Nobody actually says things like '70% probability the sky is green.'"

I do, all the time, because I run a prediction market. So far it's been right 57 out of 60 times. The market's ability to say 70% that [insert key business metric] is [insert major corporate strategy] allows management to build very valuable decision trees. Highly recommended.
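A rough check on the track record cited above (the 57-of-60 tally is from the comment; treating it as a calibration sample is my assumption):

```python
# Observed hit rate for the prediction market described above.
hits, total = 57, 60
hit_rate = hits / total
print(f"hit rate: {hit_rate:.0%}")  # hit rate: 95%
```

Note that if the market were routinely quoting probabilities near 70%, a well-calibrated market would be right about 70% of the time; a 95% hit rate would suggest its quoted probabilities are conservative (or that most of its 60 calls were made at much higher confidence than 70%).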

Actually I don't know how to pronounce Eliezer's name. How do you pronounce "Eliezer Yudkowsky"?


