
December 20, 2008


Ultimately, expertise is never about placing a lamp to illuminate the answer, but rather, directing the broadening beam of a searchlight illuminating the space of possibilities in which an answer might be found.

The former entails the unreasonable expectation that one might be able to triangulate, despite the agent being within the space to be observed. The latter expresses more coherently the relationship of the agent to the observed, with instrumental effectiveness increasing despite—indeed by virtue of—even more rapidly increasing uncertainty.

Of course, this view may violate the intuitions of those steeped in the (virtual) reality of video-gaming, where the ultimate context is in fact knowable, expertise does in fact triangulate rather than merely illuminate, and increasing intelligence of the game does in fact entail increasing certainty.

But who [here on OvercomingBias] would argue against such intuitions, such logical and empirical facts?

s/merely illuminate/merely illuminate a way forward

Prediction markets are not one person or even a defined group.

Also, a website could be set up where known experts take public, very well-defined bets that get paid in virtual money, say a rating, a bit like the online chess community. Experts could then be ranked into Master, Grandmaster, etc.
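The chess analogy can be made concrete. Here is a minimal Python sketch of the standard Elo update applied to a resolved bet between two experts; the K-factor of 32 and the 400-point scale are the usual chess defaults, not anything specific to expert betting:

```python
def elo_update(r_winner, r_loser, k=32.0):
    """Update two ratings after a resolved bet.

    The winner's expected score follows the standard Elo logistic curve,
    so an upset winner gains more points than a favorite would.
    """
    expected_win = 1.0 / (1.0 + 10.0 ** ((r_loser - r_winner) / 400.0))
    delta = k * (1.0 - expected_win)
    return r_winner + delta, r_loser - delta

# Two equally rated experts: the winner gains half of K.
print(elo_update(1500, 1500))  # -> (1516.0, 1484.0)
```

Titles like Master or Grandmaster would then simply be rating thresholds, as they are in chess.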

I don't fault the experts. I fault the crooks and criminals. Because of them we all feel uncertain. We are not secure because we have no control over the crooks to bring them to justice. They have taken over our markets and our government. Greenspan is an expert. An expert at being a criminal.

Robin: Your general principle that X is not about Y seems valid to me, yet I don't think you take it far enough. Specifically, it seems to me that even when X is not about Y you continue to assume that Y comes from X, for instance, that justified true beliefs about technical matters largely come from academia even though academia is not about justified true beliefs about technical matters. Why not assume that when X is not about Y, Y largely doesn't come from X, especially when X is about claiming to be about Y and thus probably about claiming credit for Y?

Michael, if justified true beliefs about technical matters don't largely come from academia, what's the distribution across sources? Industry? Government?

The purpose of experts is to hold and communicate the current state of human understanding.

Sometimes, like if you ask a physicist about a behavior of a mechanical system, the understanding is very precise, and allows for predictions to be made.

Sometimes, perhaps in aspects of using psychology to interpret/project human behavior, the understanding narrows a situation into a set of possibilities.

Sometimes understanding is limited to what is *not* true.

And, sometimes, the value in an expert is simply to make clear that no one really understands something.

All of these things, I think, are valuable. For example, even if economists can't agree on aspects of the present financial crisis and how to deal with it, it helps enormously just to know that there is not expert agreement: this informs you that you shouldn't sacrifice too much to follow a single direction just because someone seems confident about it. In fact, however, even though there might be disagreement among economists, presumably there are some things that they all agree would be very bad (like, for example, dramatically increasing interest rates right now). One shouldn't underestimate the value in this.

Michael, no common area of life is about justified true beliefs, so someone seeking a source of such things has no choice but to look at areas that are about other things. For topics without prediction markets, what else would you suggest beyond academia?

An expert is someone with top skill in a subject, or knowledge of it. As I read Robin's post, Malcolm's latest piece for the New Yorker came instantly to mind, where he describes the struggles to identify experts, using the examples of NFL quarterback, teacher, and financial consultant.

There he says: "We're used to dealing with prediction problems by going back and looking for better predictors." But then he notes, in the cases of quarterback and teacher, that neither IQ tests, college performance (on the field or in the classroom), nor grades in school predict success in these roles, that is, expertise.

He continues with two more points relevant to this discussion: "The job they are being groomed for is so particular and specialized there is no way to know who will succeed at it and who won't," and "Test scores, graduate degrees, and certifications... turn out to be as useful in predicting success as having a quarterback throw footballs into a bunch of garbage cans."

In the same issue, David Samuels describes a Wisconsin truck driver who became a recognized expert in the construction of Fat Man and Little Boy, so much so that scientific historians and LANL folks revere his knowledge. Here we see the characteristics it actually takes to become an expert, in contrast to the failures of society's supposed markers of expertise on display in Malcolm's piece.

An issue we as a society have is that we are not so good at choosing the truly skilled or knowledgeable; we have difficulty finding them, and as Malcolm suggests, our supposed predictors of expertise don't turn out to be so useful. And this is an excellent argument for prediction markets - not only does information aggregation offer accurate answers, but the true experts self-identify. The market finds them for us.
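As an aside, the information aggregation invoked here can be sketched concretely with Robin's logarithmic market scoring rule (LMSR) for a binary question. This is only an illustrative toy; the liquidity parameter b = 100 is an arbitrary choice:

```python
import math

def lmsr_cost(q_yes, q_no, b=100.0):
    """Market maker's cost function for outstanding share quantities."""
    return b * math.log(math.exp(q_yes / b) + math.exp(q_no / b))

def price_yes(q_yes, q_no, b=100.0):
    """Instantaneous YES price, interpretable as the market probability."""
    e_yes = math.exp(q_yes / b)
    return e_yes / (e_yes + math.exp(q_no / b))

def cost_to_buy(delta_yes, q_yes, q_no, b=100.0):
    """Amount a trader pays to buy delta_yes more YES shares."""
    return lmsr_cost(q_yes + delta_yes, q_no, b) - lmsr_cost(q_yes, q_no, b)
```

A trader who believes the true probability exceeds the current price profits in expectation by buying YES, which moves the price toward his belief; that is the sense in which the true experts self-identify by trading.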

Comparing Malcolm's piece with Samuels's article, I think, goes some way toward answering Robin's question of how we can tell when we want experts for info and not just to socially certify certainty by adding weight to the in-group's prejudices.

So my first suggestion is that when we approach an "expert" with a question best answered with a model, table, chart or algorithm, and they present such, and we accept that, then we seek info.

For other questions, if the expert offers same but we reject it, we are seeking mere social certainty or in-group confirmation.

I illustrate this by recalling the late political commentator Tim Russert, who made such waves on TV by daring to present a hand-drawn table on a tiny whiteboard to answer election questions, a first. Russert was presenting election results in a time of uncertainty when the public truly needed a real answer. The whiteboard was later enshrined in the Smithsonian as a sign of its historical import and of the reverence Americans had for it.

Contrast this with the recent election, when CNN's Anderson Cooper flashed around a 3D pop-up chart. Like Russert, Cooper was providing real information in a time of uncertainty when the public sought a place to turn. His chart was admired for its tech but mocked by comedian Jon Stewart for its empty theater. That illustrates a case in which Cooper was expected only to provide social certainty.

(I use this only as a recent situation probably familiar to many here; I am not making any particular political statement myself with this example, and I conclude by invoking Against Disclaimers.)

Another example, perhaps less familiar to a general audience, is the use of management consultants. "They come in, interview you, and then charge you a fortune to tell you what you already know" is the standard rap against fancy management consultants. Yet executives hire them all the time anyway - because, of course, what they seek is social certainty.

Finally to answer Robin's other question, how the desire to consult experts for social confirmation cuts against the use of prediction markets, I once again note that I have successfully instituted markets, so I question that there is much resistance when the benefits of markets are well-explained.

But in the spirit of the question, of course, the market can turn up nonconfirming answers, which could threaten the social order and thus the inner workings of a corporation or government more than just losing a moderate amount of money with a wrong answer would.

So this "confirmation expertise" preference could easily play a role in resistance to market usage.

Carl and Robin: If there's no common area of life that is about justified true beliefs, I would expect justified true beliefs to come from hobbyists working largely outside whatever system they are nominally embedded in: for instance, from scientists who were drawn to academia because it is nominally about justified true beliefs, but who have subsequently recognized the need to compartmentalize their lives between the search for justified true beliefs, which they value intrinsically (and would pursue on their own if there were no academia), and the academic status optimization that they suspect contributes nothing socially useful.

I am reminded of the person who donates $100K to 3rd world disease prevention and buys a $100K Mercedes to effectively get the status and social benefits which both contribute to his utility function rather than donating $200K to the Opera which would get him strictly less of both, but which is salient to him because it is labeled as a single action and better fulfills his utility function than would anything else labeled as a single action. I suspect that the people who are most frustrated by academia are likewise those who wish to do worthwhile work but fail to compartmentalize their actions and spend all their time doing non-optimally worthwhile and simultaneously non-optimally academic status promoting work.

My main speculative hypothesis here is that most advance is in academia but not of it, i.e., it is done by people who would contribute similarly much to knowledge (possibly less, possibly more) if academia didn't exist and they did their work as gentleman-scientist hobbyists while putting most of their time into making money to fund their own work.


If academia has succeeded in sweeping up most of the amateurs who would advance science in its absence, then on your model isn't it true after all that academics are the source of most justified true beliefs? You're just advancing a hypothesis about why academia works as well as it does, not contesting its status as source of justified true beliefs about technical matters at all.

Carl: Sort of. I would say that the above is sort of like claiming that babies come from hospitals. The hospitals aren't responsible for the babies, which would exist without the hospitals. The spatial origin is being correctly attributed, but the cause is not.

Michael, hobbies are not about truth either.

Obviously most hobbies are not about truth, but some may be. Generally speaking, the "principle of universal cross-domain incompetence" is self-undermining, or more specifically, strongly tends to undermine the process that leads to belief in it. By contrast, the "principle of nearly universal cross-domain incompetence" is potentially robust. For instance, after thousands of papers show the Wason task to be failed by most participants we still believe in the competence of the people who set up the task and judge the canonical answer to be correct. The task undermines our belief that "formal operations can be executed by most American adults" but not the belief that "executing formal operations is possible".

Humans have areas of life where the experts' findings and recommendations run counter to their intuitions, desires, etc. Such areas include morals, sex, fidelity, AIDS, selfishness, healthy nutrition and lifestyle, and exercise.

When people are battling temptations and desires in the above-mentioned areas, the last thing they want on their minds is certainty - whatever clarity can come from experts - as the dissonance will spoil the gratification from those activities. People will actually suppress all certainty and rationalise doubt of the experts in order to go ahead and derive gratification from the wayward activities.

The last thing one wants is clarity and certainty in advice against what one wants to get sensational gratification from.

The problem with expertise is that when you realize you know more than others, you begin to think you know everything.

Worse, even if you're aware of how little you know, other people treat you as if you knew, or ought to know, everything.

Expertise is worthwhile, but it does breed overconfidence which, if unchecked, counteracts the expertise.

How did we get this far without mentioning Nassim Nicholas Taleb (of "Black Swan" & "Fooled by Randomness" fame)? Hasn't he said this, and more, many years ago, in spades?
