
September 18, 2008

Comments

Nice.

"you cannot come up with clever reasons why the gaps in your model don't matter." Sure, sometimes you can't, but sometimes you can; sometimes there are things which seem relevant but which are genuinely irrelevant, and you can proceed without understanding them. I don't think it's always obvious which is which, but of course, it's a good idea to worry about falsely putting a non-ignorable concept into the "ignorable" box.

Now it's getting interesting. I finally understand what you were trying to say in your morality posts, which, I admit, I was unable to digest (I prefer to know where I'm going when I cross inferential distances). Please be sure to do a good post or two on your "Bayesian enlightenment". I still vividly remember how profound the impact of my own "Evolutionary enlightenment" was on my earlier self.

"Please be sure you do a good post or two on your 'Bayesian enlightenment'. I still vividly remember how profound was the impact of my own 'Evolutionary enlightenment' on my earlier self."

Mine was a "compatibilist enlightenment," when I stopped believing in the silly version of free will. Thanks, Wikipedia!

Eliezer, I think you have dissolved one of the most persistent and venerable mysteries: "How is it that even the smartest people can make such stupid mistakes?"

Being smart just isn't *good* *enough*.

"A theory which is not refutable by any conceivable event is non-scientific. Irrefutability is not a virtue of a theory (as people often think) but a vice." - Karl Popper, Conjectures and Refutations

Popper is traditional rationalism, no? I don't see young Eliezer applying it.

And with the Singularity at stake, I thought I just had to proceed at all speed using the best concepts I could wield at the time, not pause and shut down everything while I looked for a perfect definition that so many others had screwed up...

In 1997, did you think there was a reasonable chance of the singularity occurring within 10 years? From my vague recollection of a talk you gave in New York circa 2000, I got the impression that you thought this really could happen. In which case, I can understand you not wanting to spend the next 10 years trying to accurately define the meaning of "right" etc. and likely failing.

Eliezer, I think you have dissolved one of the most persistent and venerable mysteries: "How is it that even the smartest people can make such stupid mistakes?"

Michael Shermer wrote about that in "Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time". On the question of smart people believing weird things, he describes essentially the same process Eliezer experienced: once smart people decide to believe a weird thing for whatever reason, it's much harder to convince them that their beliefs are flawed, because they are that much better at poking holes in counterarguments.

Avast! But 'ought' ain't needin' to be comin' from another 'ought', if it be arrived at empirically. Yar.

once smart people decide to believe a weird thing for whatever reason, it's much harder to convince them that their beliefs are flawed, because they are that much better at poking holes in counterarguments.

That's not quite it -- if they were rational, and the counterarguments were valid, they would notice the contradiction and conclude that their position was incorrect.

The problem with smart people isn't that they're better at demolishing counterarguments, because valid counterarguments can't be demolished. The problem with smart people is that they're better at rationalization: convincing themselves that irrational positions are rational, that invalid arguments are valid, and that valid ones are invalid.

A mind capable of intricate, complex thought is capable of intricate, complex self-delusion. Increasing the intricacy and complexity doesn't lead to revelation, it just makes the potential for self-delusion increase.

It's not intelligence that compensates for the weaknesses in intelligence. People who think that cleverness is everything do not cultivate perception and doubt. There's a reason foxes are used as a symbol of error in Zen teachings, after all.

Eliezer, you have previously said that rationality is about "winning" and that you must reason inside the system you have, i.e., our human brains. Is there a core thought or concept you would recommend when approaching problems such as how to define your own goals? That is to say, how do you improve a goal system without appealing either to some prior goal or to the very goal in question? I suppose there really are no exterior judges of your goals' performance, only your own interior performance metrics, which are made by the very same goals you are trying to optimize. That doesn't seem to dissolve my confusion, just deepen it.

We all know what happened to Donald Rumsfeld, when he went to war with the army he had, instead of the army he needed.

Sorry, Eliezer, but when it comes to politics you are often wrong. AFAIK, Donald Rumsfeld is doing fine and made a lot of money from the war, as did many others in power. Using your words: he is smiling from the top of a giant heap of utility. Do you really think he cares about the army or Iraq?
