
Rationalization is irrational

It's a tragedy that these words sound so similar. We have the words "truth" and "lying" – totally different vowels and consonants. But we have "rational" and "rationalization". It's as if lying were called "truthization", so that we had "truth" and "truthization" and had to constantly remind ourselves that the one is nothing like the other.

Rationalization is impossible. No matter how much you want to, no matter how smart you are, you cannot make rational an idea that wasn't arrived at rationally in the first place. You cannot make a lie true. You can only swap the lie out entirely for the truth, just as you can only start the search for solutions over from scratch and run it rationally this time.

A thought experiment, from www.greaterwrong.com/posts/9f5EXt8KNNxTAihtZ/a-rational-argument

You are, by occupation, a campaign manager, and you’ve just been hired by Mortimer Q. Snodgrass, the Green candidate for Mayor of Hadleyburg. As a campaign manager reading a book on rationality, one question lies foremost on your mind: “How can I construct an impeccable rational argument that Mortimer Q. Snodgrass is the best candidate for Mayor of Hadleyburg?”

Sorry. It can’t be done.

“What?” you cry. “But what if I use only valid support to construct my structure of reason? What if every fact I cite is true to the best of my knowledge, and relevant evidence under Bayes’s Rule?”

Sorry. It still can’t be done. You defeated yourself the instant you specified your argument’s conclusion in advance.

This year, the Hadleyburg Trumpet sent out a 16-item questionnaire to all mayoral candidates, with questions like “Can you paint with all the colors of the wind?” and “Did you inhale?” Alas, the Trumpet’s offices are destroyed by a meteorite before publication. It’s a pity, since your own candidate, Mortimer Q. Snodgrass, compares well to his opponents on 15 out of 16 questions. The only sticking point was Question 11, “Are you now, or have you ever been, a supervillain?”

So you are tempted to publish the questionnaire as part of your own campaign literature . . . with the 11th question omitted, of course.

Which crosses the line between rationality and rationalization. It is no longer possible for the voters to condition on the facts alone; they must condition on the additional fact of their presentation, and infer the existence of hidden evidence.

Indeed, you crossed the line at the point where you considered whether the questionnaire was favorable or unfavorable to your candidate, before deciding whether to publish it. “What!” you cry. “A campaign should publish facts unfavorable to their candidate?” But put yourself in the shoes of a voter, still trying to select a candidate—why would you censor useful information? You wouldn’t, if you were genuinely curious. If you were flowing forward from the evidence to an unknown choice of candidate, rather than flowing backward from a fixed candidate to determine the arguments.

[…]

If you really want to present an honest, rational argument for your candidate, in a political campaign, there is only one way to do it:

  • Before anyone hires you, gather up all the evidence you can about the different candidates.
  • Make a checklist which you, yourself, will use to decide which candidate seems best.
  • Process the checklist.
  • Go to the winning candidate.
  • Offer to become their campaign manager.
  • When they ask for campaign literature, print out your checklist.

Only in this way can you offer a rational chain of argument, one whose bottom line was written flowing forward from the lines above it. Whatever actually decides your bottom line is the only thing you can honestly write on the lines above.
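
A minimal sketch of the two flows in code (the checklist items, the scores, and the "Opponent" name are made up for illustration; only Snodgrass comes from the quoted story): the forward-flowing procedure sums every item for every candidate and reads the conclusion off the totals, while the backward-flowing one fixes the conclusion first and then keeps only the items that favor it – the omit-Question-11 move.

    # Hypothetical checklist: item -> score per candidate (higher is better).
    checklist = {
        "keeps campaign promises":         {"Snodgrass": 1,  "Opponent": 1},
        "supports the local schools":      {"Snodgrass": 1,  "Opponent": 1},
        "is, or ever was, a supervillain": {"Snodgrass": -2, "Opponent": 0},
    }

    def forward_flow(checklist):
        """Flow forward: sum all the evidence, then read off the conclusion."""
        totals = {}
        for scores in checklist.values():
            for candidate, score in scores.items():
                totals[candidate] = totals.get(candidate, 0) + score
        return max(totals, key=totals.get), totals

    def backward_flow(checklist, chosen):
        """Rationalize: the conclusion is fixed in advance, so keep only the
        items on which the chosen candidate is not beaten by any rival."""
        return {item: scores for item, scores in checklist.items()
                if all(scores[chosen] >= s for c, s in scores.items() if c != chosen)}

    print(forward_flow(checklist))                # the evidence decides the bottom line
    print(backward_flow(checklist, "Snodgrass"))  # the bottom line decides the "evidence"

Note that the forward-flowing version is free to return the opponent – which is the point: you don't get to know your bottom line before you process the checklist.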

Differences between rationalization and rationality:

  • Rationalization: backward flow – from a fixed bottom line to the arguments written above it.
  • Rational: forward flow – from the evidence to whatever bottom line it supports.


Aumann's agreement theorem

The purpose of my personal wiki is Cruxiness,

No two rationalists can agree to disagree.
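
For reference, a hedged formal statement (the standard textbook formulation, not something specific to this wiki): two agents who share a common prior, and whose posterior probabilities for an event are common knowledge, must have equal posteriors.

    % Aumann (1976), standard formulation.
    % Agents 1 and 2 share a common prior P on a state space \Omega and have
    % information partitions \mathcal{P}_1, \mathcal{P}_2; fix an event E and a state \omega.
    q_i = P\bigl(E \mid \mathcal{P}_i(\omega)\bigr), \qquad i \in \{1, 2\}
    % If the values q_1 and q_2 are common knowledge at \omega, then
    q_1 = q_2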

EY proposes a norm that allows asking "is that your true rejection?" (be careful, though, because publicly psychoanalyzing someone can make the conversation degenerate fast).

Ideal disagreers ask themselves what their true rejection is – they seek out their cruxes (perhaps with a technique such as Internal Double Crux), or they can play Double Crux with each other. They have learned to ask themselves how and where they got a belief in the first place, to skip arguments that merely sound good today, and to feel the difference between their true rejection and an argument they came up with just now.

Disagreement can often be traced back to one of the following or other reasons:

  • Uncommon, but well-supported, scientific knowledge or math;
  • Long inferential distances;
  • Hard-to-verbalize intuitions, perhaps stemming from specific visualizations;
  • Zeitgeists inherited from a profession (which may have good reason for it);
  • Patterns perceptually recognized from experience;
  • Sheer habits of thought;
  • Emotional commitments to believing in a particular outcome;
  • Fear that a past mistake could be disproved;
  • Deep self-deception for the sake of pride or other personal benefits.

People may find it embarrassing to talk about some of these in the moment, which is partly why you can't expect to resolve every disagreement on the spot – people also need alone time to sort themselves out. But you can at least ask about some of them, e.g. like this: "is that simple, straightforward-sounding reason your true rejection, or does it come from intuition X or professional zeitgeist Y?"

www.greaterwrong.com/posts/TGux5Fhcd7GmTfNGC/is-that-your-true-rejection

What links here

  • Is That Your True Rejection?

Flinch towards your belief's most painful weaknesses

You need to do this spontaneously, as a matter of course. That skill takes some built-up confidence/trust that your world won't shatter. This can be built up from

  • repeated experience of genuinely attacking your own beliefs and seeing it result only in good things – repeated experience that changing your beliefs does not end the world; on the contrary, you were glad every time you did;
  • enough experience of learning from mistakes that you instinctively know a prize awaits you – a great leap ahead – whenever you can uncover a mistake you've been making.

Normally, we instinctively avoid looking, as if avoiding a red-hot burner: www.greaterwrong.com/posts/dHQkDNMhj692ayx78/avoiding-your-belief-s-real-weak-points?hide-nav-bars=true. It is instinct to attack your own beliefs only on points you subconsciously identify as safe targets – points where some part of you suspects you'll be able to mount a defense. You automatically look for the easiest target, for the feel-good of rehearsing your supportive evidence, rather than for the hardest target out of a desire to reveal and destroy false beliefs, as if sweeping your body for ticks. Those disgusting ticks.

The core motivation here is different: it is not just that you want to "cover your bases", feel you've "paid your dues", and perform the act of open-minded questioning, as if checking off a box on a list of ritual steps you're supposed to follow to pay obeisance to rationalist propriety. You may even share a desire with the genuinely curious: the desire to bring your beliefs and reality into sync. But that desire alone won't do it. What's needed, on top of it, is the bone-deep awareness that if your beliefs and reality are out of sync, there is only one way to bring them back into sync, and that is to change your beliefs. Reality is a fixed point, like a mountain: you climb it; it was never an option to make it come to you. The core motivation must be that you want to know what's true, more than you want your beliefs to be shown true.

This instinctive flinch towards the pain is more likely to happen if you put yourself in a certain frame of mind – so, when analyzing one of your beliefs or discussing with someone what is true on some topic, recite to yourself some hallowed phrases like the Litany of Tarski: "If the sky is blue, I desire to believe the sky is blue. If the sky is green, I desire to believe the sky is green."


"It's raining outside but I don't believe it is"

It's called Moore's paradox when you say "It's raining outside but I don't believe it is".

Honestly making such a statement and believing it requires a mind capable of doublethink, i.e. a mind that implicitly believes it has the ability to deceive itself, yet doesn't consciously frame it in those terms, i.e. doesn't explicitly believe it. Fortunately, this is a case of an error you can prevent yourself and others from making ever again just by pointing it out.
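
A small formalization shows why this is a paradox of assertion rather than an outright contradiction (this is the standard doxastic-logic reading, with a belief operator B and positive introspection; it is not anything specific to this note):

    % Let p stand for "it is raining" and B\varphi for "I believe \varphi".
    % Moore's sentence:
    \varphi \;\equiv\; p \land \lnot B p
    % \varphi itself is consistent: the world can be such that it rains and I fail to believe it.
    % But believing it is incoherent: in a normal doxastic logic, belief distributes over
    % conjunction,
    B(p \land \lnot B p) \;\Rightarrow\; B p \land B \lnot B p
    % and with positive introspection (B p \Rightarrow B B p) the agent ends up believing
    % both B p and \lnot B p, i.e. doublethink made explicit.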

A realistic example of such a sentence: someone said "I believe people are nicer than they really are." It's usually the "really are" part that refers to what they literally believe, while the "I believe…" part refers to something else: it may be a paraphrased endorsement of a way of behaving, or it may be a Belief-in-belief.

Watch for the words "I believe…". You may have the habit of using them to present a simple belief-about-how-the-world-is, but there's a large group of people who don't, and who instead communicate such beliefs by saying that something is a certain way, full stop. E.g. they'll say "snow is white" or "God exists", not "I believe snow is white" or "I believe God exists" (religious profession is usually a case of belief-in-belief). When such a person then goes so far as to add "I believe…", it serves a different purpose, as explained above.

What links here

  • Don't Believe You'll Self-Deceive