Followup to: Entangled Truths, Contagious Lies
If you once tell a lie, the truth is ever after your enemy.
I have previously spoken of the notion that, the truth being entangled, lies are contagious. If you pick up a pebble from the driveway, and tell a geologist that you found it on a beach - well, do you know what a geologist knows about rocks? I don't. But I can suspect that a water-worn pebble wouldn't look like a droplet of frozen lava from a volcanic eruption. Do you know where the pebble in your driveway really came from? Things bear the marks of their places in a lawful universe; in that web, a lie is out of place.
What sounds like an arbitrary truth to one mind - one that could easily be replaced by a plausible lie - might be nailed down by a dozen linkages to the eyes of greater knowledge. To a creationist, the idea that life was shaped by "intelligent design" instead of "natural selection" might sound like a sports team to cheer for. To a biologist, plausibly arguing that an organism was intelligently designed would require lying about almost every facet of the organism. To plausibly argue that "humans" were intelligently designed, you'd have to lie about the design of the human retina, the architecture of the human brain, the proteins bound together by weak van der Waals forces instead of strong covalent bonds...
Or you could just lie about evolutionary theory, which is the path taken by most creationists. Instead of lying about the connected nodes in the network, they lie about the general laws governing the links.
And then to cover that up, they lie about the rules of science - like what it means to call something a "theory", or what it means for a scientist to say that they are not absolutely certain.
So they pass from lying about specific facts, to lying about general laws, to lying about the rules of reasoning. To lie about whether humans evolved, you must lie about evolution; and then you have to lie about the rules of science that constrain our understanding of evolution.
But how else? Just as a human would be out of place in a community of actually intelligently designed life forms, and you have to lie about the rules of evolution to make it appear otherwise; so too, beliefs about creationism are themselves out of place in science - you wouldn't find them in a well-ordered mind any more than you'd find palm trees growing on a glacier. And so you have to disrupt the barriers that would forbid them.
Which brings us to the case of self-deception.
A single lie you tell yourself may seem plausible enough, when you don't know any of the rules governing thoughts, or even that there are rules; and the choice seems as arbitrary as choosing a flavor of ice cream, as isolated as a pebble on the shore...
...but then someone calls you on your belief, using the rules of reasoning that they've learned. They say, "Where's your evidence?"
And you say, "What? Why do I need evidence?"
So they say, "In general, beliefs require evidence."
This argument, clearly, is a soldier fighting on the other side, which you must defeat. So you say: "I disagree! Not all beliefs require evidence. In particular, beliefs about dragons don't require evidence. When it comes to dragons, you're allowed to believe anything you like. So I don't need evidence to believe there's a dragon in my garage."
And the one says, "Eh? You can't just exclude dragons like that. There's a reason for the rule that beliefs require evidence. To draw a correct map of the city, you have to walk through the streets and make lines on paper that correspond to what you see. That's not an arbitrary legal requirement - if you sit in your living room and draw lines on the paper at random, the map's going to be wrong. With extremely high probability. That's as true of a map of a dragon as it is of anything."
So now this, the explanation of why beliefs require evidence, is also an opposing soldier. So you say: "Wrong with extremely high probability? Then there's still a chance, right? I don't have to believe if it's not absolutely certain."
Or maybe you even begin to suspect, yourself, that "beliefs require evidence". But this threatens a lie you hold precious; so you reject the dawn inside you, push the sun back under the horizon.
Or you've previously heard the proverb "beliefs require evidence", and it sounded wise enough, and you endorsed it in public. But it never quite occurred to you, until someone else brought it to your attention, that this proverb could apply to your belief that there's a dragon in your garage. So you think fast and say, "The dragon is in a separate magisterium."
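The map-drawing point in the dialogue above can be made concrete with a toy sketch (a hypothetical setup, not from the original post): represent the territory as a list of binary facts, then compare a map built by observation with one drawn at random in your living room.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# A toy "territory": 20 binary facts about the city (hypothetical example).
territory = [random.randint(0, 1) for _ in range(20)]

# A map drawn by walking the streets: record what you actually observe.
observed_map = list(territory)

# A map drawn at random, without looking outside.
random_map = [random.randint(0, 1) for _ in range(20)]

# The observed map matches the territory by construction; the random map
# matches in full only with probability 2**-20 -- about one in a million.
print(observed_map == territory)
print(sum(o == t for o, t in zip(random_map, territory)), "of 20 facts right by luck")
```

Nothing hinges on the particular numbers; the point is only that correlation between map and territory has to come from somewhere, which is what "beliefs require evidence" cashes out to.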
Having false beliefs isn't a good thing, but it doesn't have to be permanently crippling - if, when you discover your mistake, you get over it. The dangerous thing is to have a false belief that you believe should be protected as a belief - a belief-in-belief, whether or not accompanied by actual belief.
A single Lie That Must Be Protected can block someone's progress into advanced rationality. No, it's not harmless fun.
Just as the world itself is more tangled by far than it appears on the surface; so too, there are stricter rules of reasoning, constraining belief more strongly, than the untrained would suspect. The world is woven tightly, governed by general laws, and so are rational beliefs.
Think of what it would take to deny evolution or heliocentrism - all the connected truths and governing laws you wouldn't be allowed to know. Then you can imagine how a single act of self-deception can block off the whole meta-level of truthseeking, once your mind begins to be threatened by seeing the connections. Forbidding all the intermediate and higher levels of the rationalist's Art. Creating, in its stead, a vast complex of anti-law, rules of anti-thought, general justifications for believing the untrue.
Steven said, "Promoting less than maximally accurate beliefs is an act of sabotage. Don't do it to anyone unless you'd also slash their tires." Giving someone a false belief to protect - convincing them that the belief itself must be defended from any thought that seems to threaten it - well, you shouldn't do that to someone unless you'd also give them a frontal lobotomy.
Once you tell a lie, the truth is your enemy; and every truth connected to that truth, and every ally of truth in general; all of these you must oppose, to protect the lie. Whether you're lying to others, or to yourself.
You have to deny that beliefs require evidence, and then you have to deny that maps should reflect territories, and then you have to deny that truth is a good thing...
Thus comes into being the Dark Side.
I worry that people aren't aware of it, or aren't sufficiently wary - that as we wander through our human world, we can expect to encounter systematically bad epistemology.
The "how to think" memes floating around, the cached thoughts of Deep Wisdom - some of it will be good advice devised by rationalists. But other notions were invented to protect a lie or self-deception: spawned from the Dark Side.
"Everyone has a right to their own opinion." When you think about it, where was that proverb generated? Is it something that someone would say in the course of protecting a truth, or in the course of protecting from the truth? But people don't perk up and say, "Aha! I sense the presence of the Dark Side!" As far as I can tell, it's not widely realized that the Dark Side is out there.
But how else? Whether you're deceiving others, or just yourself, the Lie That Must Be Protected will propagate recursively through the network of empirical causality, and the network of general empirical rules, and the rules of reasoning themselves, and the understanding behind those rules. If there is good epistemology in the world, and also lies or self-deceptions that people are trying to protect, then there will come into existence bad epistemology to counter the good. We could hardly expect, in this world, to find the Light Side without the Dark Side; there is the Sun, and that which shrinks away and generates a cloaking Shadow.
Mind you, these are not necessarily evil people. The vast majority who go about repeating the Deep Wisdom are more duped than duplicitous, more self-deceived than deceiving. I think.
And it's surely not my intent to offer you a Fully General Counterargument, so that whenever someone offers you some epistemology you don't like, you say: "Oh, someone on the Dark Side made that up." It's one of the rules of the Light Side that you have to refute the proposition for itself, not by accusing its inventor of bad intentions.
But the Dark Side is out there. Fear is the path that leads to it, and one betrayal can turn you. Not all who wear robes are either Jedi or fakes; there are also the Sith Lords, masters and unwitting apprentices. Be warned, be wary.
As for listing common memes that were spawned by the Dark Side - not random false beliefs, mind you, but bad epistemology, the Generic Defenses of Fail - well, would you care to take a stab at it, dear readers?
The most dangerous dark side meme I can think of is the idea of sinful thoughts: that questioning one's faith is itself a sin even if not acted upon. A close second is "don't try to argue with the devil -- he has more experience at it than you".
Posted by: Daniel Franke | October 17, 2008 at 08:20 PM
Not all who wear robes are either Jedi or fakes
What do you mean by "wear robes"? Could we move away from references to fictional stories?
Posted by: TGGP | October 17, 2008 at 09:32 PM
Eliezer,
I agree with you as regards people deceiving themselves. But I disagree regarding people who deceive others on purpose. Some of these people can be very smart and know very well what they are doing and which biases they are playing on. They have elevated the art of deception to a science - ohhh yes, read marketing books as an example. Otherwise a superintelligence would become stupid in the process of lying to the human operator with the intention of getting out of the box.
Posted by: Roland | October 17, 2008 at 09:37 PM
-faith: i.e. unconditional belief is good. It's like loyalty. Questioning beliefs is like betrayal.
-The saying "Stick to your guns.": Changing your mind is like deserting your post in a war. Sticking to a belief is like being a heroic soldier.
-The faithful: i.e. us, we are the best, God is on our side.
-the infidels: i.e. them, sinners, barely human, or not even.
-God: Infinitely powerful alpha male. Treat him as such, with all the implications...
-The devil and his agents: They are always trying to seduce you to sin. Any doubt is evidence that the devil is seducing you to sin and succeeding. Anyone opposed to your beliefs is cooperating with/being influenced by the devil.
-Assassination fatwas: Whacking people who are anti-Islam is the will of Allah.
-a sexually satisfying lifestyle is bad: This makes people more angsty (especially young men). This angst is your fault and it's a sin. To be less angsty you should be less sinful, ergo fight your sexual urges. And so the cycle of desire, guilt, angst and confusion continues.
-no masturbation: see above.
-you are born in debt to Jesus because he died for your sins 2000 years ago.
That's all I could think of right now.
Posted by: PK | October 17, 2008 at 09:41 PM
The endorsement of information cascades: claiming that X is indisputably true in the name of philosophical majoritarianism, and thus biasing research and statements to foster belief in X is desirable as a way to foster true beliefs (where the majority only exists because of such biased efforts).
Posted by: Carl Shulman | October 17, 2008 at 09:43 PM
Just to be clear, I'm not looking for random false beliefs defended by Dark Side epistemology, I'm looking for Dark Side epistemology itself - the Generic Defenses of Fail.
Roland, these are the Sith masters.
Posted by: Eliezer Yudkowsky | October 17, 2008 at 09:45 PM
In general, beliefs require evidence.
In general? Which beliefs don't?
Think of what it would take to deny evolution or heliocentrism
Or what it would take to prove that the Moon doesn't exist.
As for listing common memes that were spawned by the Dark Side - would you care to take a stab at it, dear readers?
Cultural relativity.
Such-and-such is unconstitutional.
The founding fathers never intended... (various appeals to stick to the founding fathers original vision)
Be reasonable (moderate)
Show respect for your elders
It's my private property
_____ is human nature.
Don't judge me.
_____ is unnatural and therefore wrong.
_____ is natural and therefore right.
We need to switch to alternative energies such as wind, solar, and tidal.
The poor are lazy
The entire American political vocabulary (bordering on Orwellian)
Animal rights
.. much more.
Posted by: Peter | October 17, 2008 at 09:58 PM
>Everyone has a right to their own opinion. When you think about it, where was that proverb generated?
In the words of the great sage Emo Phillips, "I used to think that the brain was the most wonderful organ in my body. Then I realized who was telling me this."
Posted by: Dave | October 17, 2008 at 09:59 PM
I thought of some more.
-there is a destiny/God's plan/reason for everything: i.e. some powerful force is making things the way they are and it all makes sense (in human terms, not cold heartless math). That means you are safe, but don't fight the status quo.
-everything is connected with "energy" (mystically): you or special/chosen people might be able to tap into this "energy". You might glean information you normally shouldn't have or gain some kind of special powers.
-Scientists/professionals/experts are "elitists".
-Mystery is good: It makes life worthwhile. Appreciating it makes us human. As opposed to the idea that destroying mystery is good.
That's it for now.
Posted by: PK | October 17, 2008 at 10:01 PM
>I'm looking for Dark Side epistemology itself - the Generic Defenses of Fail.
Relax. It will be over soon.
We're past that now.
X is supernatural.
X is natural.
You're correct, but it will make people uncomfortable.
You're smart. You should go to college.
Posted by: Dave | October 17, 2008 at 10:07 PM
I'm pretty confident that "Everyone has a right to their own opinion" was generated by people trying to protect themselves from people who were trying to protect themselves from the truth.
We really need some talk about what the consequences of an AI with access to its own source code and self-protecting beliefs would be.
Posted by: michael vassar | October 17, 2008 at 10:07 PM
I'm looking for Dark Side epistemology itself - the Generic Defenses of Fail.
In that case - association, essentialism, popularity, the scientific method, magic, and what I'll call Past-ism.
Posted by: Peter | October 17, 2008 at 10:08 PM
We are missing something. Humans are ultimately driven by emotions. We should look for which emotions beliefs tap into, in order to understand why people seek or avoid certain beliefs.
Posted by: PK | October 17, 2008 at 10:28 PM
A particular flavor of "if it ain't broke, don't fix it" that points to established traditions as "having worked for ages". Playing off the fear of the unknown? The meme of traditions in general adds weight to many of these.
I second "cultural relativity" as being an extension of "everyone having a right to their opinion", but in both cases point to them as also being tools to find things in one's own life that *are* arbitrary and in need of evaluation on a more objective basis.
Posted by: outofculture | October 17, 2008 at 11:32 PM
Isn't the scientific method a servant of the Light Side, even if it is occasionally a little misguided?
Posted by: Nominull | October 17, 2008 at 11:35 PM
@Eliezer:
Roland, these are the Sith masters.
Ok, got your point. One thing I do worry about, though, is how much those movie analogies end up inducing biases in you and others.
Posted by: Roland | October 17, 2008 at 11:47 PM
@Eliezer:
To drive home my earlier point: the whole idea of Jedi vs. Sith reflects a Manichaean worldview (good vs. bad). Isn't this a simplification?
Posted by: Roland | October 17, 2008 at 11:56 PM
Isn't the scientific method a servant of the Light Side, even if it is occasionally a little misguided?
Too restrictive. Science is not synonymous with the hypothetico-deductive method, nor is there any sort of thing called the "scientific method" from which scientists draw their authority on a subject. Neither is it a historically accurate description of how science has done its work. Read up on Feyerabend.
Science is inherently structureless and chaotic. It's whatever works.
Posted by: Peter | October 17, 2008 at 11:56 PM
Eliezer writes, "In general, beliefs require evidence."
To which Peter replies, "In general? Which beliefs don't?"
Normative beliefs (beliefs about what should be) don't, IMHO. What would count as evidence for or against a normative belief?
Posted by: Richard Hollerith | October 18, 2008 at 12:20 AM
How about "Comparing Apples and Oranges," or "How Dare You Compare," a misrepresentation of the scope of analogies. For a recent example, see the response to John Lewis's drawing an analogy between certain aspects of the McCain campaign and those of George Wallace -- the response is not a consideration of the scope and aptness of the analogy but a rejection that any analogy at all can be drawn between two subjects when one is so generally recognized to be Evil. The McCain campaign does not attempt to differentiate the aspects under analogy (rhetoric and its potential for the fomentation of violence) from those of Wallace, but rather condemns the idea that the analogy can be considered at all. Under the epistemology of Fail, any difference between two subjects of comparison is enough to reject its validity, regardless of the relevance of the distinction to the actual comparison being drawn. See also: Godwin's Law.
Some self-entitled males like to use this one, particularly in defense of the notion that one has an inviolate right to make sexual advances toward other people regardless of circumstance or outward sign. Sooner or later, after demonstrating how each of their justifications also justifies sexual assault, it leads to "how dare you compare me to a rapist," which is where the fun begins. After I am done epistemologically belittling them, I point out that the obvious fact that sexual assault is known to be bad is a manifestation of general principles of ethical interaction among humans, and not a special case handed down from a God who says that everything that is not expressly forbidden by a law is good.
Posted by: celeriac | October 18, 2008 at 12:25 AM
Animal rights???
You're smart. You should go to college???
Essentialism???
Posted by: O | October 18, 2008 at 12:42 AM
Normative beliefs (beliefs about what should be) don't [require evidence], IMHO. What would count as evidence for or against a normative belief?
That's correct if you don't consider pure reason to be evidence - but I consider it to be so. So morality and ethics and all these normative things are, in fact, based on evidence - although it is a mix of abstract evidence (reason) with concrete evidence (empirical data). If you base your morality, or any normative theory (of how the world should be), on anything other than how things actually are (including mathematics), you necessarily have to ascribe some supernatural property to it.
Posted by: Peter | October 18, 2008 at 12:52 AM
One giant category of dark side reasoning looks like "That idea is _____"
Where the idea is an "is" (not a "should") and _____ is any negative affect word with a meaning other than "untrue".
Examples include {unpatriotic, communist, capitalist, liberal, conservative, provincial, any-demonym-goes-here, cultish, religious, atheistic, sinful, evil, dangerous, repugnant, elitist, condescending, out-of-touch, politically incorrect, offensive, argumentative, hateful, cowardly, fool-hardy, inappropriate, indecent, unsettling, lewd, silly, idiotic, new-fangled, old-fashioned, staid, dead, uncool, too simple, too complicated} and many more.
Important note: The exception to this rule is if the speaker goes on to show how _____ is evidence about the truth of the proposition. If you can say why something is idiotic, that's fine. A seasoned scientist has the right to say "that theory looks too complicated" if they have many examples of surprisingly simple theories explaining things well, but a creationist doesn't earn the right to accuse the theory of evolution of being "too complicated" until they explain how whatever they mean by "too complicated" bears on the idea being wrong.
To avoid concluding that an idea is true, the Dark Side's first line of defense is to avoid even considering *whether* the idea is true. Those who are good enough at suppressing contradictions can simply save themselves the trouble of building up "a vast complex of anti-law, rules of anti-thought". After all, building such a complex is a risky business from the standpoint of protecting the precious belief. The larger the complex gets, the more close scrapes it could have with real sensory experience.
Just as a murderer ties the corpse of his victim to a heavy stone before throwing it into the water, so too do victims of the Dark Side tie ideas they want to dispose of to negative affect words. It really does make them less likely to resurface.
The same caution applies to tying positive affect words to desired ideas.
Posted by: Marcello | October 18, 2008 at 01:26 AM
Saying 'There is lots of evidence for it' when in fact there is little to none.
I guess the epistemology is 'It is ok to believe something if you believe there is evidence to support it.'
Creationists are told the fossil record supports X and Y, and they run with it.
Posted by: James Andrix | October 18, 2008 at 02:43 AM
The concept of different epistemological magisteria. E gave an example of it in this post (and also in the post about scientists outside the laboratory), but his example is just the tip of the iceberg. This failure of rationality doesn't manifest itself explicitly most of the time, but is engaged in implicitly by almost everybody that I know that isn't into hardcore rationality.
It's definitely engaged in by people who are into, or at least cheer for, science and (traditional) rationality and/or philosophy. It's the double standard between what epistemological standards you explicitly endorse and what are the actual beliefs on the basis of which you act: acting as if the sun will rise tomorrow even though you endorse radical scepticism, accepting what Richard Dawkins says on his authority while seeking out refutations for creationist arguments. I think one big reason for this is that people who are interested in this sort of thing are exposed too much to deductive reasoning and hardly at all to rigorous inductive reasoning. Inductive reasoning is the practical form of reasoning that actually works in the real world (many fallacies of deductive reasoning are actually valid probabilistic inferences), and we all have to engage in it explicitly or implicitly to cope in the world. But having been exposed only to the "way" of deductive rationality, and warned against its fallacies, people may come to experience a cognitive dissonance between which epistemological techniques are useful in real life and which epistemological techniques they ought to be using - and therefore to see science, rationality and philosophy as disconnected from real life, things to be cheered for and entertaining diversions. Such people don't hold every part of their epistemological self under the same level of scrutiny, because implicitly they believe that their methods of scrutinizing are imperfect. I recognize my past self in this, but not my present self, who knows about evo psych, inductive reasoning etc., has seen that these methods actually work, and can therefore criticize his own epistemological habits using the full force of his own rationality...
This might concern mistaken, well-meaning people more than the actual Dark Side but it seems to me to be an important point anyway.
Posted by: Bo | October 18, 2008 at 03:44 AM
A few general schemas:
"True for", as in, "That may be true for you, but not for me. We each choose our own truths."
"I feel that X." Every sentence of this form is false, because X is an assertion about the world, not a feeling. Someone saying "I feel that X" in fact believes X, but calling it a feeling instead of a belief protects it from refutation. Try replying "No you don't", and watch the explosion. "How dare you try to tell me what I'm feeling!"
Write obscurely.
Never explicitly state your beliefs. Hint at them in terms that the faithful will pick up and applaud, but which give nothing for the enemy to attack. Attack the enemy by stating their beliefs in terms that the faithful will boo, while giving the enemy nothing to dispute.
Ignore the entire machinery of rationality. Treat all human interaction as nothing more than social grooming or status games in a tribe of apes.
Posted by: Richard Kennaway | October 18, 2008 at 04:13 AM
Daniel: A close second is "don't try to argue with the devil -- he has more experience at it than you".
Would you still disagree with that one if "the devil" was replaced by "a strong AI"?
Posted by: Richard Kennaway | October 18, 2008 at 04:15 AM
How about the notion of an insult as a first-order offence? "Don't insult God/Our Nation/The People/etc." It is an explicit emotional fortress that reason cannot, by definition, scale. When reason goes near there, all the 'intelligence defeating itself' mechanisms come into play. We take the fortress as our starting argument and start to think backwards until our agitated emotions are satisfied by our half-reasonable but beautiful explanation of why the fortress is safe, and why whatever caused us to doubt it is either not so or can be explained some other way. Ergo, one step deeper into dark epistemology.
Posted by: Alexandros | October 18, 2008 at 06:51 AM
Would you still disagree with that one if "the devil" was replaced by "a strong AI"?
Yes. Suffice it to say I don't think I'd be a very reliable gatekeeper :-).
(Conversely, I don't think the AI's job in the box experiment is even hard, much less impossible. Last week, I posted a $15 offer to play the AI in a run of the experiment, but my post disappeared somehow.)
Posted by: Daniel Franke | October 18, 2008 at 07:14 AM
I'm in strong agreement with Peter's examples above. I would generalize by saying that the epistemic "dark side" tends to arise whenever there's an implicit discounting of the importance of increasing context. In other words, whenever, for the sake of expediency, "the truth", "the right", "the good", etc., is treated categorically rather than contextually (or equivalently, as if the context were fixed or fully specified).
Posted by: Jef Allbright | October 18, 2008 at 09:43 AM
See, now there's a prime example of corrupted reasoning right there. Science is carefully structured chaos, ordered according to certain fundamental principles. Meeting those principles is what we mean when we talk about something 'working'.
The recognition of what 'working' is, and the tools that have been found useful in reaching that state, is what constitutes the scientific method.
Scientists do not concern themselves with what philosophers say about science -- it is my experience that they are actively contemptuous of such. Yet science goes on. Strange, isn't it? It's almost as though the philosophers didn't know what they were talking about.
(Additional: the central metaphor of this discussion is flawed - the Light and Dark sides define and require each other; contrastingly, both Jedi and Sith are corruptions and failures to properly represent the two sides of the Force. Accept one, and you reject the truth of things.)
Posted by: Caledonian | October 18, 2008 at 09:47 AM
That was part of my point - that, in this one facet of human endeavor, and in modern times rather than ancient ones, it's remarkable the extent to which an actual Light Side Epistemology and Dark Side Epistemology have developed. Like the sort of contrast that naive people draw between Their Party and the Other Party, only in real life.
Posted by: Eliezer Yudkowsky | October 18, 2008 at 10:06 AM
- There's a huge conspiracy covering it up
- Well, that's just what one of the Bad Guys would say, isn't it?
- Why should I have to justify myself to you?
- Oh, you with your book-learning, you think you're smarter than me?
- They said that to Einstein and Galileo!
- That's a very interesting question, let me show you the entire library that's been written about it (where if there were a satisfactory answer it would be shortish)
- How can you be so sure?
Posted by: Paul Crowley | October 18, 2008 at 12:27 PM
Marcello,
I think your list generalizes too much. I see three main types of words on the list. The first type indicates in-group out-group distinction and seems pretty poisonous to me. The second are ad hominem arguments which are dangerous, but do apply sometimes. And then there are a few like "too complicated." You call those "negative affect words"? Surely it is better to say "that is too complicated to be true" than to say simply "that is not true"?
Posted by: Douglas Knight | October 18, 2008 at 12:32 PM
-You can't prove I'm wrong!
-Well, I'm an optimist.
-Millions of people believe it, how can they all be wrong?
-You're relying too much on cold rationality.
-How can you possibly reduce all the beauty in the world to a bunch of equations?
Posted by: IL | October 18, 2008 at 12:49 PM
Douglas says: """ And then there are a few like "too complicated." You call those "negative affect words"? Surely it is better to say "that is too complicated to be true" than to say simply "that is not true"? """
Well, yes, but that's only when whatever you mean by complicated has something to do with being true. Some people, though, use the phrase "too complicated" just so they can avoid thinking about an idea, and in that context it really is an empty negative-affect phrase.
Of course, it is better for a scientist to say "that's too complicated to be true" rather than just "that's not true." You're not done by any means once you've made a claim about whether something is true or false; the claim still needs to be backed up. The point was simply that any characterization of an idea is bad unless that characterization really does have something to do with whether the idea is true.
Posted by: Marcello | October 18, 2008 at 12:56 PM
That was part of my point - that, in this one facet of human endeavor, and in modern times rather than ancient ones, it's remarkable the extent to which an actual Light Side Epistemology and Dark Side Epistemology have developed. Like the sort of contrast that naive people draw between Their Party and the Other Party, only in real life.
That sounds a lot more like you're being subject to the same bias.
"Some people have this view, even though reality is more complex, but what's amazing is that in a subject area I care a lot about, that's what's there."
Yes, if you label the things you accept Light, and the things you reject Dark, you'll see that dichotomy, but why that grouping?
Is traditional rationality Light side? or just bayesianism?
The dark side might be more appropriately grouped into a few different schools.
There will be classes of similar rules that contain both light and dark members.
Both sides have always been around; some of the light side rules might be new, and it is new to group the light side together as the things that work best.
But they are not opposed to each other. Just as physics doesn't care if you suffer, logic doesn't care if you get the right answer. There is no battle for our minds. Humans argue about the origin of life, but all existing humans use a combination of light and dark thinking. Creationists can look for evidence and evolutionists can say irrational things for their own psychological defense. The 'sides' coexist quite peacefully, not at all like competing bands of primates.
And this might be a reason that it's so hard to get rid of bad thinking even in ourselves. The light side doesn't have any alarm bell defenses against the dark side.
Posted by: James Andrix | October 18, 2008 at 01:24 PM
"One man's modus ponens is another man's modus tollens" is a maxim that is easily weaponised by the Dark Side by taking it in a one-sided way: one sees one's own implications as proving their consequents, and the other side's implications as casting doubt on their antecedents.
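To make the maxim concrete (a sketch in standard propositional notation; the formalization is mine, not Crowe's): from the same implication, two equally valid inference rules are available, and which one you reach for depends on which proposition you already trust more.

```latex
% Modus ponens: accept the antecedent, conclude the consequent.
\text{Modus ponens:}\quad P \to Q,\; P \;\vdash\; Q
% Modus tollens: reject the consequent, conclude against the antecedent.
\text{Modus tollens:}\quad P \to Q,\; \neg Q \;\vdash\; \neg P
```

The one-sided move is to apply ponens to one's own implications (my premises prove my conclusion) while applying tollens to the other side's (your absurd conclusion refutes your premises), without ever weighing the implications themselves symmetrically.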
Posted by: Alan Crowe | October 18, 2008 at 01:53 PM
If you once tell a lie, the truth is ever after your enemy.
That isn't true.
I told lies when I was a kid. If I got caught, I gave up rather than mounting an epistemological attack.
Richard Kennaway: "I feel that X." Every sentence of this form is false, because X is an assertion about the world, not a feeling. Someone saying "I feel that X" in fact believes X, but calling it a feeling instead of a belief protects it from refutation. Try replying "No you don't", and watch the explosion. "How dare you try to tell me what I'm feeling!"
If I say I feel something, I'm talking about an emotion. I don't intend it to be an objective statement about the world, and I'm not offended if someone says it doesn't apply to everyone else.
Posted by: Nancy Lebovitz | October 18, 2008 at 02:20 PM
Nancy Lebovitz: If I say I feel something, I'm talking about an emotion.
That prohibits you from saying "I feel that X". No emotion is spoken of in saying "I feel that the Riemann hypothesis is true", or "I feel that a sequel to The Hobbit should never be made", or "I feel that there is no God but Jaynes and Eliezer (may he live forever) is His prophet", or in any other sentence of that form. "I feel" and "that X" cannot be put together and make a sensible sentence.
If someone finds themselves about to say "I feel that X", they should try saying "I believe that X" instead, and notice how it feels to say that. It will feel different. The difference is fear.
Posted by: Richard Kennaway | October 18, 2008 at 03:16 PM
I believe that there are circumstances in which you can say "I feel that X". What that could rationally mean is that you yourself recognize that you do not have enough evidence or knowledge to justify a belief about X vs. not-X, but that without evidence you lean toward X because you like that alternative. You are admitting ignorance on the subject. Ideally, this would then also imply an openness with regard to forming a belief about X or not-X given some evidence -- that recognition that all you have is a feeling about it means a very weak attachment to the idea of X.
PhilB
Posted by: Phil Boncer | October 18, 2008 at 04:43 PM
Caledonian: What fundamental principles? As far as I can tell the only fundamental principle is that it has to work. But I'm open to counterexamples, if you are.
The recognition of what 'working' is, and the tools that have been found useful in reaching that state, is what constitutes the scientific method.
The scientific method is actually pretty specific - and it is not a set of tools. There is no systematic method of advancing science, no set of rules/tools which are exclusively the means to attaining scientific knowledge.
Scientists do not concern themselves with what philosophers say about science -- it is my experience that they are actively contemptuous of such... It's almost as though the philosophers didn't know what they were talking about.
That's actually my point. Scientists do what works, and employ methodological diversity - the "scientific method" is not an actual description of how real scientists do their work, nor how real science has advanced. It's propaganda, made up by certain people who were/are absolutely horrified that science has no defining and fundamental underlying principles - which would throw their entire schema of epistemology into turmoil.
The "rules" of science, if they exist, are subject to change at any time. Science has physical reality at the input and useful models at the output - and no bona fide, tried and true, structure in between.
Posted by: Peter | October 18, 2008 at 04:55 PM
You haven't earned the right to say X.
Posted by: Nick Tarleton | October 18, 2008 at 05:17 PM
You haven't earned the right to say X.
I think that one is poorly phrased but defensible. You can think of it as shorthand for "Your life experiences have provided you with an insufficient collection of Bayesian priors to permit you to assert X with any reasonable certainty".
Posted by: Daniel Franke | October 18, 2008 at 06:07 PM
The worst one is "this is my truth". The ultimate victory of map over territory. In the universe I create, rocks fall up. Forcing me to believe in "gravity" puts you in my proper role as divine map-maker. Your "reason" and "evidence" are just a power grab. I choose not to believe the rock I'm about to drop on my toes will hurt. Ouch! You bastard, you contaminated my purity of self-definition.
Posted by: Julian Morrison | October 18, 2008 at 06:40 PM
"Everyone has a right to their own opinion" is largely a product of its opposite. For a long period many people believed "If my neighbor has a different opinion than I do, then I should kill him". This led to a bad state of affairs and, by force, a less lethal meme took hold.
Posted by: Thom Blake | October 18, 2008 at 10:26 PM
To Richard Kennaway:
Your original point, which I didn't read carefully enough:
"I feel that X." Every sentence of this form is false, because X is an assertion about the world, not a feeling. Someone saying "I feel that X" in fact believes X, but calling it a feeling instead of a belief protects it from refutation. Try replying "No you don't", and watch the explosion. "How dare you try to tell me what I'm feeling!"
"No, you don't" sounds like a chancy move under the circumstances. Have you tried "How sure are you about X?" and if so, what happens?
More generally, statements usually imply more than one claim. If you negate a whole statement, you may think that which underlying claim you're disagreeing with is obvious, but if the person you're talking to thinks you're negating a different claim, it's very easy to end up talking past each other and probably getting angry at each other's obtuseness.
My reply: If I say I feel something, I'm talking about an emotion.
You again: That prohibits you from saying "I feel that X". No emotion is spoken of in saying "I feel that the Riemann hypothesis is true", or "I feel that a sequel to The Hobbit should never be made", or "I feel that there is no God but Jaynes and Eliezer (may he live forever) is His prophet", or in any other sentence of that form. "I feel" and "that X" cannot be put together and make a sensible sentence.
If someone finds themselves about to say "I feel that X", they should try saying "I believe that X" instead, and notice how it feels to say that. It will feel different. The difference is fear.
It sounds to me as though you've run into a community (perhaps representative of the majority of English speakers) with bad habits. I, and the people I prefer to hang out with, would be able to split "I feel that x" into a statement about emotions or intuitions and a statement about the perceived facts which give rise to the emotions or intuitions.
I believe that "I believe that a sequel to The Hobbit should never be made" is emotionally based. Why would someone say such a thing unless they believed that the sequel would be so bad that they'd hate it?
Here's something I wrote recently about the clash between trying to express the feeling that strong emotions indicate the truth and universality of their premises, and the fact that the real world is more complicated.
Posted by: Nancy Lebovitz | October 19, 2008 at 03:25 AM
"I feel that X" really means, "I believe X, and accept that others will likely disagree." The purpose is to serve as a conversational marker showing that disagreement is expected. When used properly, this is simply to grease the wheels of discourse a bit, making it more likely that the respondent will have the proper idea about the attitude the speaker takes towards the idea, not to imply that the disagreement will be taken as unresolvable. It makes discourse more efficient. Of course, it can be misused in the way that Richard complains about, but I think he's being obtuse to be against the phrase in every manifestation, and especially obtuse in the way he frames his disagreement.
Posted by: pdf23ds | October 19, 2008 at 03:57 AM
I am being forthright, not obtuse. I say again that there is no statement of the form "I feel that X" that would not be rendered more accurate by replacing it with "I believe that X". That people use the word "feel" in this way does not make it a statement about feelings: it remains a statement about beliefs. Neither of those statements actually contains any expression of a feeling about X. Here is one that does: "I am angry that X". Compare "I feel that X" -- what is the feeling? It is not there. In a larger context, the listener may be able to tell, but if they can, they can do so equally well from "I believe that X".
It might well be. But the emotions would not be communicated any better by using the word "feel". They are not communicated at all by either word. (I can think of other reasons why someone might object to a sequel: for example, some people have an ethical objection to fanfiction.)
And no, I've never actually responded to an "I feel that" with a blunt "No you don't". It would rarely help. But I do know people who would call me on it if I ever used the expression, as I would them. A lot of the time -- I am talking about actual, specific experience here, not vague generalisation -- people react emotionally to beliefs they are holding that they have never actually stated out loud as beliefs and asked, "Are these actually true?" Until you have noticed what you believe, you cannot update your beliefs. I-feel-thats avoid that confrontation.
To use "feel", as a couple of people suggested, to mean "tentative belief" changes only the map: there are still no actual feelings being expressed, just a word that has been blurred. This does not grease the wheels of discourse, it gums them up. Better to reserve "feel" for feelings and "believe" for beliefs, for it is a short step from calling them both by the same name to passing them off as the same thing, and then you are on the Dark Side, whether you know it or not. State something as a belief and you open yourself to the glorious possibility of being proved wrong. Call it a feeling and you give yourself a licence to ignore reality.
Posted by: Richard Kennaway | October 19, 2008 at 05:52 AM
Hyperbole as a perversion of projection: arguments like "...and next you'll be killing AI developers who disagree with FAI, to prevent them posing an existential threat" contain both enough clear reasoning and enough unknowable elements to sound possible -- plausible, even. This is used to discredit the original idea, not just the fantastical extrapolation.
Posted by: outofculture | October 19, 2008 at 02:12 PM