Anti-epistemology

Being able to find outs for a theory that doesn't fit the evidence is anti-knowledge, and the more practice you get at it, the crazier you become.

sources

Anti-epistemology refers to bad explicit beliefs about rules of reasoning, usually developed in the course of protecting an existing false belief—false beliefs are opposed not only by true beliefs (that must then be obscured in turn), but also by good rules of systematic reasoning (which must then be denied).

From Yudkowsky's 2008 post "Dark Side Epistemology" (www.greaterwrong.com/posts/XTWkjCJScy2GFAgDt/dark-side-epistemology):

A single lie you tell yourself may seem plausible enough, when you don’t know any of the rules governing thoughts, or even that there are rules; and the choice seems as arbitrary as choosing a flavor of ice cream, as isolated as a pebble on the shore . . .

. . . but then someone calls you on your belief, using the rules of reasoning that they’ve learned. They say, “Where’s your evidence?”

And you say, “What? Why do I need evidence?”

So they say, “In general, beliefs require evidence.”

This argument, clearly, is a soldier fighting on the other side, which you must defeat. So you say: “I disagree! Not all beliefs require evidence. In particular, beliefs about dragons don’t require evidence. When it comes to dragons, you’re allowed to believe anything you like. So I don’t need evidence to believe there’s a dragon in my garage.”

And the one says, “Eh? You can’t just exclude dragons like that. There’s a reason for the rule that beliefs require evidence. To draw a correct map of the city, you have to walk through the streets and make lines on paper that correspond to what you see. That’s not an arbitrary legal requirement—if you sit in your living room and draw lines on the paper at random, the map’s going to be wrong. With extremely high probability. That’s as true of a map of a dragon as it is of anything.”

So now this, the explanation of why beliefs require evidence, is also an opposing soldier. So you say: “Wrong with extremely high probability? Then there’s still a chance, right? I don’t have to believe if it’s not absolutely certain.”

Or maybe you even begin to suspect, yourself, that “beliefs require evidence.” But this threatens a lie you hold precious; so you reject the dawn inside you, push the Sun back under the horizon.

Or you’ve previously heard the proverb “beliefs require evidence,” and it sounded wise enough, and you endorsed it in public. But it never quite occurred to you, until someone else brought it to your attention, that this proverb could apply to your belief that there’s a dragon in your garage. So you think fast and say, “The dragon is in a separate magisterium.”

Having false beliefs isn’t a good thing, but it doesn’t have to be permanently crippling—if, when you discover your mistake, you get over it. The dangerous thing is to have a false belief that you believe should be protected as a belief—a belief-in-belief, whether or not accompanied by actual belief.

A single Lie That Must Be Protected can block someone’s progress into advanced rationality. No, it’s not harmless fun.

Just as the world itself is more tangled by far than it appears on the surface, so too there are stricter rules of reasoning, constraining belief more strongly, than the untrained would suspect. The world is woven tightly, governed by general laws, and so are rational beliefs.

Think of what it would take to deny evolution or heliocentrism—all the connected truths and governing laws you wouldn’t be allowed to know. Then you can imagine how a single act of self-deception can block off the whole meta level of truth-seeking, once your mind begins to be threatened by seeing the connections.
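A minimal Bayesian gloss on "beliefs require evidence" (my formalization, not the post's): an observation E counts as evidence for a hypothesis H exactly when E is more likely if H is true, i.e. P(E|H) > P(E|¬H). Updating on such an E raises the probability of H, by Bayes' theorem:

  P(H|E) = P(E|H)·P(H) / [P(E|H)·P(H) + P(E|¬H)·P(¬H)] > P(H)

Symmetrically, every time the garage fails to show the footprints, heat, or breath a real dragon would produce, P(dragon) shrinks further. "Still a chance" just names a probability heading toward zero, not a license to believe.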

Topoi

  • Rules of reasoning
  • Protecting a false belief
  • A false belief opposed by true beliefs, which must be obscured
  • Those true beliefs are supported by good rules of systematic reasoning, which must then be denied

Logical rudeness

kinda related?

As mentioned previously, some Christians tell atheists that atheists know there's a God really and are just being atheists to annoy, because they know it teases. Some atheists tell religious people that theists won't accept atheistic arguments because they're afraid of death, or too immersed in the church community to bear the social cost of leaving. In a conversation about race or gender, it won't be long before someone claims another person's view is held because of their privilege. And so on.

Suber calls this rude rather than fallacious because it is possible for people who hold true beliefs to be "rude" in this way (and in fact, rejecting arguments because they come from rude people is itself rude). Rather, rudeness violates the norms for debate, […]

… offenses against the cooperative flow of debate, which might be "logically rude" even if spoken politely; for example, saying "X because Y", and then, after one side went to a great deal of trouble to test and falsify Y, saying, "Well, Y doesn't really matter, really X because Z". Similarly, ignoring all the diligent work that evolutionary biologists did to dig up previous fossils, and insisting you can only be satisfied by an actual videotape, is "logically rude" because you're ignoring evidence that someone went to a great deal of trouble to provide to you.

other resources

Dark Side memes

  • "Everyone's entitled to their opinion"
  • "I can define a word any way I like"
  • That all "truths" stand on equal ground
    • The word you want here is beliefs, not truths: truth is just the correspondence between a belief and reality, so if you have two incompatible "truths", at least one of them fails to correspond to reality. You can talk this way about beliefs, values, etc., but not about truths.
  • That there is no single truth
  • That the truth is impossible to get at anyway

Privileging the Hypothesis
