Followup to: The Proper Use of Humility, Tsuyoku Naritai
(At this point, I fear that I must recurse into a subsequence; but if all goes as planned, it really will be short.)
I once lent Xiaoguang "Mike" Li my copy of "Probability Theory: The Logic of Science". Mike Li read some of it, and then came back and said:
"Wow... it's like Jaynes is a thousand-year-old vampire."
Then Mike said, "No, wait, let me explain that -" and I said, "No, I know exactly what you mean." It's a convention in fantasy literature that the older a vampire gets, the more powerful they become.
I'd enjoyed math proofs before I encountered Jaynes. But E.T. Jaynes was the first time I picked up a sense of formidability from mathematical arguments. Maybe because Jaynes was lining up "paradoxes" that had been used to object to Bayesianism, and then blasting them to pieces with overwhelming firepower - power being used to overcome others. Or maybe the sense of formidability came from Jaynes not treating his math as a game of aesthetics; Jaynes cared about probability theory, it was bound up with other considerations that mattered, to him and to me too.
For whatever reason, the sense I get of Jaynes is one of terrifying swift perfection - something that would arrive at the correct answer by the shortest possible route, tearing all surrounding mistakes to shreds in the same motion. Of course, when you write a book, you get a chance to show only your best side. But still.
It spoke well of Mike Li that he was able to sense the aura of formidability surrounding Jaynes. It's a general rule, I've observed, that you can't discriminate between levels too far above your own. E.g., someone once earnestly told me that I was really bright, and "ought to go to college". Maybe anything more than around one standard deviation above you starts to blur together, though that's just a cool-sounding wild guess.
So, having heard Mike Li compare Jaynes to a thousand-year-old vampire, one question immediately popped into my mind:
"Do you get the same sense off me?" I asked.
Mike shook his head. "Sorry," he said, sounding somewhat awkward, "it's just that Jaynes is..."
"No, I know," I said. I hadn't thought I'd reached Jaynes's level. I'd only been curious about how I came across to other people.
I aspire to Jaynes's level. I aspire to become as much the master of Artificial Intelligence / reflectivity, as Jaynes was master of Bayesian probability theory. I can even plead that the art I'm trying to master is more difficult than Jaynes's, making a mockery of deference. Even so, and embarrassingly, there is no art of which I am as much the master now, as Jaynes was of probability theory.
This is not, necessarily, to place myself beneath Jaynes as a person - to say that Jaynes had a magical aura of destiny, and I don't.
Rather I recognize in Jaynes a level of expertise, of sheer formidability, which I have not yet achieved. I can argue forcefully in my chosen subject, but that is not the same as writing out the equations and saying: DONE.
For so long as I have not yet achieved that level, I must acknowledge the possibility that I can never achieve it, that my native talent is not sufficient. When Marcello Herreshoff had known me for long enough, I asked him if he knew of anyone who struck him as substantially more natively intelligent than myself. Marcello thought for a moment and said "John Conway - I met him at a summer math camp." Darn, I thought, he thought of someone, and worse, it's some ultra-famous old guy I can't grab. I inquired how Marcello had arrived at the judgment. Marcello said, "He just struck me as having a tremendous amount of mental horsepower," and started to explain a math problem he'd had a chance to work on with Conway.
Not what I wanted to hear.
Perhaps, relative to Marcello's experience of Conway and his experience of me, I haven't had a chance to show off on any subject that I've mastered as thoroughly as Conway had mastered his many fields of mathematics.
Or it might be that Conway's brain is specialized off in a different direction from mine, and that I could never approach Conway's level on math, yet Conway wouldn't do so well on AI research.
Or...
...or I'm strictly dumber than Conway, dominated by him along all dimensions. Maybe, if I could find a young proto-Conway and tell them the basics, they would blaze right past me, solve the problems that have weighed on me for years, and zip off to places I can't follow.
Is it damaging to my ego to confess that last possibility? Yes. It would be futile to deny that.
Have I really accepted that awful possibility, or am I only pretending to myself to have accepted it? Here I will say: "No, I think I have accepted it." Why do I dare give myself so much credit? Because I've invested specific effort into that awful possibility. I am blogging here for many reasons, but a major one is the vision of some younger mind reading these words and zipping off past me. It might happen, it might not.
Or sadder: Maybe I just wasted too much time on setting up the resources to support me, instead of studying math full-time through my whole youth; or I wasted too much youth on non-mathy ideas. And this choice, my past, is irrevocable. I'll hit a brick wall at 40, and there won't be anything left but to pass on the resources to another mind with the potential I wasted, still young enough to learn. So to save them time, I should leave a trail to my successes, and post warning signs on my mistakes.
Such specific efforts predicated on an ego-damaging possibility - that's the only kind of humility that seems real enough for me to dare credit myself. Or giving up my precious theories, when I realized that they didn't meet the standard Jaynes had shown me - that was hard, and it was real. Modest demeanors are cheap. Humble admissions of doubt are cheap. I've known too many people who, presented with a counterargument, say "I am but a fallible mortal, of course I could be wrong" and then go on to do exactly what they planned to do previously.
You'll note that I don't try to modestly say anything like, "Well, I may not be as brilliant as Jaynes or Conway, but that doesn't mean I can't do important things in my chosen field."
Because I do know... that's not how it works.
In a few years, you will be as embarrassed by these posts as you are today by your former claims of being an Algernon, or that a logical paradox would make an AI go gaga, the tMoL argumentation you mentioned in the last few days, the Workarounds for the Laws of Physics, Love and Life Just Before the Singularity, and so on and so forth. Ask yourself: Will I have to delete this, too?
And the person who told you to go to college was probably well-meaning, and not too far from the truth. Was it Ben Goertzel?
Posted by: Manuel Moertelmaier | September 26, 2008 at 05:59 AM
Despite all fallibility of memory, I would be shocked to learn that I had ever claimed that a logical paradox would make an AI go gaga. Where are you getting this from?
Ben's never said anything like that to me. The comment about going to college was from an earnest ordinary person, not acquainted with me. And no, I didn't snap at them, or laugh out loud; it was well-intentioned advice. Going to college is a big choice for a lot of people, and this was someone who met me, and saw that I was smart, and thought that I seemed to have the potential to go to college.
Which is to imply that if there's a level above Jaynes, it may be that I won't understand it until I reach Jaynes's level - to me it will all just look like "going to college". If I recall my timeline correctly, I didn't comprehend Jaynes's level until I had achieved the level of thinking naturalistically; before that time, to achieve a reductionist view of intelligence was my whole aspiration.
Posted by: Eliezer Yudkowsky | September 26, 2008 at 06:16 AM
Although I've never communicated with you in any form, and hence don't know what it's like for you to answer a question of mine, or correct a misconception (you have, but gradually), or outright refute a strongly held belief...or dissolve a Wrong Question...
...You're still definitely the person who strikes me as inhumanly genius - above all else.
Posted by: Not You | September 26, 2008 at 06:25 AM
Unfortunately for my peace of mind and ego, people who say to me "You're the brightest person I know" are noticeably more common than people who say to me "You're the brightest person I know, and I know John Conway". Maybe someday I'll hit that level. Maybe not.
Until then... I do thank you, because when people tell me that sort of thing, it gives me the courage to keep going and keep trying to reach that higher level.
Seriously, that's how it feels.
Posted by: Eliezer Yudkowsky | September 26, 2008 at 06:33 AM
So how does it work, in your opinion? Because “I may not be as brilliant as Jaynes or Conway, but that doesn't mean I can't do important things in my chosen field,” sounds suspiciously similar to how Hamming asserts that it works in “You and Your Research.” I guess you have a different belief about how doing important things in your chosen field works, but I don't see that you've explained that belief here or anywhere else that I've seen.
I don't suppose Marcello is related to Nadja and Josh Herreshoff?
I don't know if it helps, but while I've appreciated the things I've learned from you, my limited interaction with you hasn't made me think you're the brightest person I know. I think of you as more or less at my level — maybe a couple of standard deviations above or below, I can’t really tell. Certainly you're sharp enough that I'd enjoy hanging out with you. (Let me know the next time you're in Argentina.)
P.S. the impugnment of your notability has now been removed from your Wikipedia page, apparently as a result of people citing you in their papers.
Posted by: Kragen Javier Sitaker | September 26, 2008 at 06:56 AM
Wait wait wait wait. Eliezer...are you saying that you DON'T know everything????
~runs off and weeps in a corner in a fetal position~
Posted by: lowly undergrad | September 26, 2008 at 07:19 AM
CatAI (1998): "Precautions"/"The Prime Directive of AI"/"Inconsistency problem".
My memory may fail me, and the relevant archives don't go back that far, but I recall Ben (and/or possibly other people) suggesting that you go to college, or at least enroll in a grad program in AI, on the Extropy chat list around 1999/2000. I think these suggestions were related to, but not solely based on, your financial situation at that time (which ultimately led to the creation of the SIAI, so maybe we should be glad it turned out the way it did, even if, in my opinion, following the advice would have been beneficial to you and your work).
Posted by: Manuel Mörtelmaier | September 26, 2008 at 07:59 AM
I was curious how you'd react when you eventually realized you weren't as bright as you thought you were. The journey to full comprehension isn't complete yet, but it's interesting seeing this little bit unfold. For all your disdain of modesty arguments, your life makes for a great demonstration of how one can go wrong if they go unheeded.
Posted by: Yep | September 26, 2008 at 09:05 AM
I definitely see the "levels" phenomenon very often. Most people I meet who see me play a musical instrument (or 5 or 10 different ones) think I must be a genius at music - unless they're a musician, in which case they recognize me as an amateur with enough money to buy interesting instruments and enough skill to get a basic proficiency at them quickly.
And even with standard measures of intellect like rationality or math... I don't know that many of my friends who have read any of this blog would recognize you as being smarter than me, despite the fact that you're enough levels above me that my opinion of you is pretty much what "Not You" said above.
I can keep up with most of your posts, but to be able to keep up with a good teacher, and to be that good teacher, is a gap of at least a few levels. But aspiring to your level (though I may not reach it) has probably been the biggest motivator for me to practice the art. I certainly won't be the one who zips by you, but you've at least pulled me up to a level where I might be able to guide one who will down a useful path.
Posted by: Eric | September 26, 2008 at 09:14 AM
Up to now there never seemed to be a reason to say this, but now that there is:
Eliezer Yudkowsky, afaict you're the most intelligent person I know. I don't know John Conway.
Posted by: Sebastian Hagen | September 26, 2008 at 09:14 AM
Your faith in math is misplaced. The sort of math smarts you are obsessed with just isn't that correlated with intellectual accomplishment. For accomplishment outside of math, you must sacrifice time that could be spent honing your math skills, to actually think about other things. You could be nearly the smartest math-type guy that anyone you meet knows, and still not accomplish much if math is not the key to your chosen subject.
Posted by: Robin Hanson | September 26, 2008 at 09:50 AM
It's interesting, actually. You're motivated by other people's low opinions of you -- this pressure you feel in your gut to prove Caledonian et al. wrong -- so you've taken what is probably fairly standard human machinery and tried to do something remarkable with it.
My question is, are you still motivated by the doubt you feel about your native abilities, or have you passed into being compelled purely by your work?
Posted by: Ken Sharpe | September 26, 2008 at 09:50 AM
Perhaps the truly refulgent (before they had so become) reached a progression tipping point at which they realized (right or wrong, ironically) that they were essentially beyond comparison, and hence stopped comparing.
Then they could allocate the scarce resources of time and thought exclusively to the problems they were addressing, thus actually attaining a level that truly was beyond comparison.
Posted by: Atstjx | September 26, 2008 at 09:52 AM
Jaynes was a really smart guy, but no one can be a genius all the time. He did make at least one notable blunder in Bayesian probability theory -- a blunder he could have avoided if only he'd followed his own rules for careful probability analysis.
Posted by: Cyan | September 26, 2008 at 09:55 AM
You come across as very intelligent when you stick to your areas of expertise, like probability theory, AI and cognitive biases, but some of your more tangential stuff can seem a little naive. Compared to the other major poster on this blog, Robin, I'd say you come across as smarter but less "wise", if that means anything to you. I'm not even a huge fan of the notion of "wisdom", but if there's something you're missing, I think that's it.
Posted by: ShardPhoenix | September 26, 2008 at 09:58 AM
If you haven't read it, Simonton's Origins of Genius draws a nice distinction between mental agility and long-term intellectual significance, and explores the correlation between the two. Not a terribly well-written book, but certainly thought-provoking.
Posted by: Rob | September 26, 2008 at 10:21 AM
@EY: We are the cards we are dealt, and intelligence is the unfairest of all those cards. More unfair than wealth or health or home country, unfairer than your happiness set-point. People have difficulty accepting that life can be that unfair, it's not a happy thought. "Intelligence isn't as important as X" is one way of turning away from the unfairness, refusing to deal with it, thinking a happier thought instead. It's a temptation, both to those dealt poor cards, and to those dealt good ones. Just as downplaying the importance of money is a temptation both to the poor and to the rich.
How could the writer of the above words be the writer of today's post? Apparently (as I'm told) you knew from the days of the Northwestern Talent Search that you weren't the smartest of those tested (not to mention all those who were not tested), but certainly one of the smartest. Apparently, you were dealt a straight flush to the king, while some in history received a royal flush. What difference does it make whether someone thinks you are the smartest person they have known, unless you are the smartest person? Does a straight flush to the king meet the threshold required to develop a method for "saving humanity"? If not, why aren't you in the camp of those who wish to improve human intelligence? *awaits clap of thunder from those dealt better hands*
Posted by: retired urologist | September 26, 2008 at 10:40 AM
Eliezer, I've been watching you with interest since 1996 due to your obvious intelligence and "altruism." From my background as a smart individual with over twenty years managing teams of Ph.D.s (and others with similar non-degreed qualifications) solving technical problems in the real world, you've always struck me as near but not at the top in terms of intelligence. Your "discoveries" and developmental trajectory fit easily within the bounds of my experience of myself and a few others of similar aptitudes, but your (sheltered) arrogance has always stood out. I wish you continued progress, not so much in ever-sharper *analysis*, but in ever more effective *synthesis* of the leading-edge subjects you pursue.
Posted by: Jef Allbright | September 26, 2008 at 11:50 AM
How much do you worry about age 40? Is that just based on your father? Conway passed 40 before Marcello was born.
Posted by: Douglas Knight | September 26, 2008 at 11:52 AM
I'll take this one because I'm almost certain Eliezer would answer the same way.
Working on AI is a more effective way of increasing the intelligence of the space and matter around us than increasing human intelligence is. The probability of making substantial progress is higher.
Posted by: Richard Hollerith | September 26, 2008 at 11:53 AM
Wow, chill out, Eliezer. You're probably among the top 10, certainly in the top 20, most-intelligent people I've met. That's good enough for anything you could want to do. You are ranked high enough that luck, money, and contacts will all be more important factors for you than some marginal increase in intelligence.
Posted by: Phil Goetz | September 26, 2008 at 11:53 AM
First, same question as Douglas: what is it with the brick wall at 40?
Second: This is another great post; it's rare for people to expose their thoughts about themselves in such an open way. Congratulations!
Regarding your ability, I'm just a regular guy (I studied math in college), but your writings are the most inspiring I've ever read. So much self-reflection about intelligence and the thinking process. The insight about how certain mental processes feel is totally new to me. You have helped me a lot to identify my own blind spots and mistakes. Now I can look back and see exactly where I went wrong in the past, and I see with clarity where there was once confusion. I wish I'd read this stuff when I was still 13 years old; maybe it could have prevented a lot of the mistakes I made later in life.
Also, one of the things I learned from you is that hard work can substitute for intelligence. Think of evolution: even a stupid person can accomplish great things if he bangs his head long enough against the problem. Well, there is still the need for a basic level of intelligence, but I guess you have that.
Did you read Richard Feynman's biography? AFAIK he was also not the smartest and had moments of great self-doubt in his career, where he even thought of giving up. I think this turned out to be a blessing, because it forced him to visualize things in a more intuitive manner, if my recollection is correct. Hence the invention of Feynman diagrams.
Regarding college: well, I went to one, and it was one of the biggest wastes of time in my life, together with school. I wish I had been as smart as you and left school at the age of 12.
Posted by: Roland | September 26, 2008 at 12:34 PM
I second Robin's comment.
A friend of mine, Steve Jordan, once asked me just how smart I thought he and I were.
I answered that I think that no one is really as smart as the two of us both think we are. You see, for many, many people it is possible to choose a weighting scheme among a dozen or so factors that contribute to intellectual work such that they are the best. You simply define the vector to their point on the "efficient aptitude frontier" as "real intelligence". A dozen or so people associated with this blog and/or with SIAI, and a smaller number who aren't, appear to me to be on points of the "known to Michael Vassar efficient aptitude frontier", though not necessarily equally mission-critical points. For my "save the world dream team" I would pick a 25-year-old Steve Jobs over a 25-year-old Terence Tao, though I'd like both of course.
Posted by: michael vassar | September 26, 2008 at 01:20 PM
Manuel, "enroll in a grad program for AI" != "you're smart, you should go to college".
Kragen, the short answer is, "It's easy to talk about the importance of effort if you happen to be Hamming." If you can make the ante for the high-stakes table, then you can talk about how little the ante counts for, and the importance of playing your cards well. But if you can't make the ante...
Robin, it's not blind faith in math or math for the sake of impressiveness, but a specific sense that the specific next problems I have to solve, will require more math than I've used up to this point. Not Andrew J. Wiles math, but Jaynes doesn't use Wiles-math either. I quite share your prejudice against math for the sake of looking impressive, because that gets you the wrong math. (Formality isn't about Precision?)
Ken, it's exclusively my work that gives me the motivation to keep working on something for years, but things like pride can give me the motivation to keep working on something for the next minute. I'll take whatever sources of motivation I can get (er, that aren't outright evil, of course).
Douglas, yes, my father changed at 40. But one of my primary sources of hope is that people have been known to do basic research later than this if they changed fields late in life, which suggests that it actually can be a matter of approach/outlook/methodology and avoiding serving on prestigious committees.
Retired, I don't understand the apparent contradiction you see. I participated in the Midwest Talent Search at a young age (not "Northwestern" anything, maybe you're confusing with Northwestern University?) and scored second-best for my grade category, but at that point I'd skipped a grade. But I think I can recall hearing about someone who got higher SAT scores than mine, at age nine. That would be decisive, if the SAT were a perfect noiseless measurement of ability to work on AI.
Yes, this is the well-known phenomenon where asking someone "How dumb are you?" produces a different answer than "How smart are you?" because they recall a different kind of evidence. But the question I'm trying to answer is "How much potential do you have to solve the remaining FAI problems you know about?" As I said to Robin, I do think this is going to involve taking a step up in math level.
To all commenters who observed that I don't seem to stand out from 10 other smart people they know, either you didn't comprehend the entirety of today's post, or you have very high confidence that you occupy the highest possible rank of human ability.
Posted by: Eliezer Yudkowsky | September 26, 2008 at 01:54 PM
Vassar - your English is encrypted - more an assumption of intelligence than a sign.
EY - I admire your work. Along with Robin this is the best Show in Town and I will miss it, when it stops.
I actually doubt whether you are accomplishing anything - but this does not seem so important to me, because the effort itself is worthwhile. And we are educated along the way.
This is a youthful blog with youthful worries. From the vantage point of age worrying about intelligence seems like a waste of time and unanswerable to boot.
But those are the stones in your shoes.
Posted by: Marshall | September 26, 2008 at 02:00 PM
@Jef Allbright:
Can you be concrete and specific about where Eliezer is or has been arrogant?
Posted by: Roland | September 26, 2008 at 02:26 PM
"Most intelligent people I've met" is not informative; we need to give quantitative estimates. My estimate is calibrated based on knowing people who passed various screenings, such as math, physics, and programming contests (including at the international level), test results on screening exams for top universities, performance in hard university courses, people starting to grasp research and programming, etc. Based on the population of the regions covered by the various screenings, and taking age, gender, and different backgrounds into account, I can approximately rate these people on a "1 in XXX" scale. I'd say that you need to be at a level of 1 in 300 or so to be able to deeply understand any technical field of human knowledge given reasonable effort, and 1 in 100 to be a competent technical specialist. There is a significant difference (which can cash out as, say, a 3x speedup at obtaining a given level of aptitude) between people who are 1 in 1000 and 1 in 10000. I know too few people beyond 1 in 10000 (about the top 30 in a contest over a population of 20 million within a 3-year age interval, given an average lifespan of 60 and background selection of 1 in 3 top people entering the contest) to say whether there is a general enough advantage to being there, or if the performance levels off and more rarely occurring extraordinary ability only presents itself on very narrow tasks, like blasting through programming contests.
People at all levels are stupid in unknown domains; it takes much effort to start demonstrating "raw intelligence" at anything (although in many things skills partially translate between domains). You can't learn to be creative enough if you don't pass a necessary threshold, but on the other hand, if you are past it, sufficient effort will make you able to solve any problem other people can solve, although it'll take more time to attain that level and to solve problems once you do.
The main problem for getting results is that it's very hard to port smart people to a new field of expertise, to convince them to start thinking about something or to actively work on improving their performance in a given domain. So, it seems that the main problem with seeing (or finding) enough brilliant people in any given field or group is not the rarity of talent, but the roads they all took, too few of which lead where you look.
People won't risk working on hard, important problems, or even think too much about exploring which problems could be important; they choose convenient, safe, or enjoyable paths, choosing the intellectual dynamic, the process, rather than proper understanding of results or appearance. The people you hear from are not the smartest there are in a given subject.
I estimate myself to be around 1 in 1000, more specifically a somewhat blinder, slower, and faulty-memory version of 1 in 5000 (as I understand it, that's not how many other people perceive their limitations). I clearly see the advantages that people with clearer minds get, but as far as I can tell I'm still able to excel at anything if I devote enough attention to it, given enough time. Extraordinary intellectual productivity is a result of taking the right road, which may depend on happenstance beyond your control. Digging yourself out of the pit of blind stupidity (relatively speaking), of seeing only a surface level and stopping the investigation there, is the most important thing (which is what the art of rationality is about: not being stupid, using what you've got, while it falls short of understanding intelligence more deeply).
From what I've read, I think that Eliezer is somewhere around 1 in 5000 on this scale, given the time he has devoted to the study of his subjects and the results he has produced. He stands out in comparison mainly because too few smart enough people engage with the questions he addresses from the same side, and of those who do, hardly anybody has devoted much serious thought to them while also avoiding getting lost on a false road. You don't see the absence of talent, but the initial stupidity in an unfamiliar domain, or entrenched mistakes where there isn't a solid body of knowledge and authority to force them out.
So, I think that his estimate of 1 in 10000-100000 is too high. The problem is more of convincing the right people to work on the problem and pointing them to the right path, rather than of finding the right people at all. Having an introductory text showing the path is a huge asset, so the decision to compose this book might be a fruitful one.
Posted by: Vladimir Nesov | September 26, 2008 at 02:36 PM
My own potential intelligence does worry me fairly often. I am currently studying to become an engineer and hope to work on some of the awesome ideas I read about on sites like this. The thing is, though, I wasted the first twenty-three years of my life. I am currently twenty-five years old and I have been forced to pretty much start from scratch on everything from social skills to education, and after two years I think I am making some headway. I am even starting to understand what Eliezer talks about in all these posts and apply it to my own life as best I can. The math still escapes me, but I managed to make it through about half of the Bayesian explanation before getting completely and utterly lost. So I think it is certainly possible to learn a huge amount of things even after young childhood, but it is rather less efficient. I have had to really struggle to get to where I am now. And since Eliezer is one of the big reasons I am so excited about getting into science, I would like to attempt to work in a similar field of research. Yep.
Posted by: Cassandra | September 26, 2008 at 03:01 PM
Let me give a shout out to my 1:50 peeps! I can't even summarize what EY has notably accomplished beyond highlighting how much more likely he is to accomplish something. All I really want is for Google to stop returning pages that are obviously unhelpful to me, or for a machine to disentangle how the genetic code works, or a system that can give absolute top-notch medical advice, or something better than the bumbling jackasses [choose any] that manage to make policy in our country. Give me one of those things and you will be one in a million, baby.
Posted by: Aron | September 26, 2008 at 03:12 PM
@Roland
I suppose you could google "(arrogant OR arrogance OR modesty) eliezer yudkowsky" and have plenty to digest. Note that the arrogance at issue is neither dishonest nor unwarranted, but it is an impairment, and a consequence of trade-offs which, from within a broader context, probably wouldn't be taken in the same way.
That's as far as I'm willing to entertain this line of inquiry, which ostensibly neutral request for facts appears to belie an undercurrent of offense.
Posted by: Jef Allbright | September 26, 2008 at 03:22 PM
Okay, I realize you're going to read that and say, "It's obviously not good enough for things requiring superhuman intelligence!"
I meant that, if you compare your attributes to those of other humans, and you sort those attributes, with the one that presents you the most trouble in attaining your goal at the top, intelligence will not be near the top of that list for you, for any goal.
Posted by: Phil Goetz | September 26, 2008 at 03:39 PM
Eliezer,
what's with the ego?
In other words - why are you so driven?
I gather from your posts that you have metaphysical views which make you believe that solving the FAI problem is the most important thing you should be doing.
But is it really that important that you are the one to bring this work to fruition?
Do you think your life will have been unfulfilled, or your opportunity wasted, if you don't finish this, and finish it as soon as you can?
Would building an exceptional foundation, which future exceptional people can improve on, not be achievement enough?
What does it matter how smart you are, if you are doing what you love, and giving it your best effort?
Perhaps it is the fear of being too late that is causing you distress. Perhaps you fear that humanity is going to be destroyed because you didn't build an FAI soon enough. Perhaps you fear that your life will end some 10,000 years sooner than you'd like.
But it is not your responsibility to save the world. It can be fun if you contribute to the effort. But planets are a dime a dozen, and lives are even cheaper than that. We are not really that important. No one is. In the grand scheme of things, our dramas and concerns are lightweight fun.
One of the problems of always being the top banana is that you never learn to realize that you don't have to be the top banana to be fulfilled in your life.
There's no need to worry so much about being on Jaynes's or Conway's level. Do what you do best, and do it because it's fun. If you've been given what it takes, then this is the fastest way to become the master of your field. And even if you didn't have what it takes - which in your case is unlikely - you would still be making a contribution and having fun.
Posted by: denis bider | September 26, 2008 at 03:39 PM
@Jef Allbright:
I suppose you could google "(arrogant OR arrogance OR modesty) eliezer yudkowsky" and have plenty to digest.
Well, I was asking you, not Google. But it seems that you are not willing to stand behind your words, making claims and then failing to provide evidence when asked. Referring to a third party is an evasive maneuver. Show us your cards!
That's as far as I'm willing to entertain this line of inquiry, which ostensibly neutral request for facts appears to belie an undercurrent of offense.
That's your supposition.
Posted by: Roland | September 26, 2008 at 03:47 PM
Eliezer, can you clarify what you mean by
"You'll note that I don't try to modestly say anything like, "Well, I may not be as brilliant as Jaynes or Conway, but that doesn't mean I can't do important things in my chosen field."
Because I do know... that's not how it works."
Posted by: Robin | September 26, 2008 at 03:53 PM
Vladimir Nesov: thanks for your comment. I found it insightful.
Posted by: denis bider | September 26, 2008 at 03:55 PM
You say 'That's not how it works.' But I think that IS how it works!
If progress were only ever made by people as smart as E.T. Jaynes, humanity would never have gotten anywhere. Even with fat tails, intelligence is still roughly normally distributed, and there just aren't that many 6 sigma events. The vast majority of scientific progress is incremental, notwithstanding that it's only the revolutionary achievements that are salient.
The real question is, do you want Friendly A.I. to be achieved? Or do you just want friendly A.I. to be achieved by YOU? There's no shame in the latter one, but the preclusion of the latter speaks little about progress towards the former (which I happen to think this blog is immensely valuable towards).
Posted by: behemoth | September 26, 2008 at 04:12 PM
I find myself, except in the case of people with obvious impairments, completely unable to determine how intelligent someone is by interacting with them. Sometimes I can determine who is capable of performing specific tasks, but I have little confidence in my ability to assess "general intelligence".
To some extent, this is because different people have acquired different skills. Archimedes of Syracuse may have been the greatest mathematician in history, but he wouldn't be able to pass the exams in a high school calculus class. Obviously, the reason he couldn't solve these math problems is not that he isn't as intelligent as today's high school students. It's because he never had a calculus textbook.
If you had two black boxes, one of which contained a 14-year-old who scores in the 98th percentile on IQ tests, and the other contained the median college graduate with a degree in some technical field, such as electrical engineering, which black box would appear more intelligent?
It's hard to tell the difference between someone who is actually smarter and someone who has simply learned more. One thing that I learned how to do very well, which contributed greatly to much of my academic success, is translate "word problems" into mathematical equations. There's a systematic way to do this that works on just about any (reasonable) textbook, and it's a task that I found many of my fellow high school students having trouble with in my science classes.
To what extent is "intelligence" simply a matter of having already learned the best ways to learn?
Posted by: Doug S. | September 26, 2008 at 04:37 PM
Also...
I believe that you don't really understand something until you can explain it to someone else, and have them understand it, too.
Posted by: Doug S. | September 26, 2008 at 04:41 PM
There are basically two reasons to get called arrogant. One is acting like you're better when you aren't. The other is refusing to politely pretend that the inferential chasm is small. Given where E is and where the mass of humanity is, if I had to make blind-guess assignments for 100 accusers picked at random and assigned them all into the "inferential distance" bin, I don't think I'd be wrong once. So a person asking to be put, or to put some accuser, into the "undeserved airs" bin had better show some sharp evidence!
Posted by: Julian Morrison | September 26, 2008 at 04:55 PM
"Math is a game for the young."
Posted by: Zubon | September 26, 2008 at 05:24 PM
"Perhaps it is the fear of being too late that is causing you distress. Perhaps you fear that humanity is going to be destroyed because you didn't build an FAI soon enough. Perhaps you fear that your life will end some 10,000 years sooner than you'd like."
Humanity's alleged demise is not the only possible way he could be too late. I wonder where Eliezer would turn his attention if someone (or some group) solved the problems of FAI before him.
Eliezer has written a number of times about how comparing your intelligence and rationality to those around you is pointless (e.g. it's never good enough to be good in comparison, etc.). This philosophy has thus far been directed at comparing oneself to lower levels of cognition - but I don't see why it shouldn't work bottom up also. Learn from the levels above you, but do not lionize them. As we all aspire to embody the higher levels, I'm sure Jaynes must have also (an old vampire, but not old enough).
Eliezer: I don't think we should worry about our particular positions on the bell curve, or set goals for where we want to be. Don't fret over the possible limitations of your brain; doing so will not change them. Just work hard and try your best; always attempt to advance - push the limitations. Jaynes was struggling against his meat-brain too. It's human - you both surpassed the village idiots and the college professors; now the difference in levels becomes more and more negligible with each step approaching the limit. Everybody is working with meat designed by the "idiot god". Push it to the limit, hate the limit, but don't be self-conscious about it.
We all wish we had gotten an earlier start on things. The importance of them is perhaps something you have to learn as you grow.
Posted by: Peter | September 26, 2008 at 05:41 PM
Eliezer: It seems to me that uncertainty about your abilities is dwarfed by uncertainty about the difficulty of the problem.
Doug S: The median college graduate in a technical field probably would test in the 95th percentile on most IQ tests, and at the 98th percentile on tests weighted heavily towards non-vocabulary crystalline g.
Posted by: michael vassar | September 26, 2008 at 07:00 PM
Eliezer: Not sure to what extent this helps or answers your questions, but I increasingly as of late find that much of my current "cached wisdom" seems to be derived from stuff you've said.
As far as actually finding the next generation or whatever, maybe some people here who know how ought to start some "private school for the gifted" that is explicitly meant to try to act almost like a Bayes Dojo or whatever, and otherwise train up people in really precise thinking?
Posted by: Psy-Kosh | September 26, 2008 at 08:09 PM
While Conway has a huge jump on you in mathematical ability, and I'm pretty sure you're not going to catch up to him, rest assured that you are not strictly dumber than Conway in every respect.
You should bear in mind how the statement "Maybe anything more than around one standard deviation above you starts to blur together, though that's just a cool-sounding wild guess" might apply to me. If your guess is literally true, then, because math is my strong suit, high mathematical ability is the smartest kind of smart that I can detect at all. For me, philosophical ability and the like would blur into "go to college"-land sooner.
In terms of philosophical intuition, you are head and shoulders above Conway. Remember Conway's "Free will theorem" (a brilliant piece of math to be sure, but very misleadingly named). Yet you report never having been confused about free will. My sense of awe at your philosophical intuition has only increased after reading the Overcoming Bias posts. It's doubly impressive to me, because I keep realizing that you are making explicit more of the helpful little nudges you gave me over the course of our work together, and I am impressed at how helpful some of these things were in practice, and at your ability to communicate things which seemed so elusive so clearly. I'm not sure how much of that was native intelligence and how much was starting with good ideas in your mental toolbox, but I could ask the same thing about Conway.
Posted by: Marcello | September 26, 2008 at 09:22 PM
Eliezer: Look on the bright side, you haven't yet relegated yourself to being a mere administrator and occasional sounding board for others' AI research projects! Ego subjugation is a bitch, but it can have minor rewards of self-satisfaction when actions driven by pressure-free buckshot mental synthesis actually bear fruit. I don't envy that it's of no help to you that the luxury of being carefree relies on the knowledge that smarter people are doing the heavy lifting, and today you're at the top tier of that brain chain!
Posted by: Dave | September 26, 2008 at 10:02 PM
Michael Vassar - With regard to IQ of college graduates, not so. According to Richard Lynn, it hovers around average, 100-110 or so. This is with the exception of mathematics and physics, which require much more.
I second Doug S.'s emphasis that it's difficult to tell people's intelligence just by talking to them. Intelligence is the speed of mind, but what you're measuring is the position. I used to have casual insight into people's IQ test results (culturally independent, non-verbal), and I was frequently surprised. The people I thought (and still do) were smart and interesting often had lower than expected results.
Posted by: denis bider | September 26, 2008 at 11:09 PM
Random question.
Who do you think was smarter: Shakespeare or Galileo?
Posted by: Doug S. | September 27, 2008 at 12:33 AM
This post finally convinced me to ask you for advice, Eliezer.
I am a soon-to-be 18-year-old freshman at Harvey Mudd College, and I've been reading OB as well as some of your former writings for some time (over a year for sure). Ever since I got here I could feel an energy from people around me that I'd barely ever seen at my average high school. I can see that if they really paid attention to your ideas, many of them would come to see the challenges of FAI as one of the most important ignored problems today. If they really tried to understand what you were saying, they would see the light.
But, the program here leaves almost no time for new reading (especially since everyone has their preferred blogs, authors, etc already) and I haven't been successful at convincing even my friends here of paying real attention to your writings. I've told them some of your most enlightening ideas, your reducing-away of confusing problems, and even your stance on the future of humanity, but they all find some convenient excuse not to pay too much attention to it and instead stick to the comfortable path of trying to become an engineer or go to grad school or any number of things - just as long as they don't have to get uncomfortable and reevaluate their whole world view.
I've had to face this problem too, but with this post it's finally clear to me that I can't just stand by and do nothing. I am still torn, because I love doing all sorts of things that would never help solve the problems of FAI. In particular I love studying languages - so much so that I always have an electronic Japanese-English-Chinese dictionary in my pocket - and I would say I'm pretty good at math and the sciences. But unless I get my act together and try to present FAI outside of English-speaking circles, I doubt it will do too much good.
If I could find a way of presenting this to the brilliant audience here, I feel it would do far more than just my own best efforts. To this end, what have you found effective in convincing an audience of far-above-average college students and professors, who had never heard of cognitive biases and who've probably never seriously contemplated the Singularity, of the weight of the FAI problem?
If you feel the comments here aren't the best place, please email me.
Posted by: Maksym Taran | September 27, 2008 at 02:38 AM