
Think up two reasons why your starting judgment could be wrong

This technique helps against a range of biases, at least including:

  • overconfidence
  • hindsight bias
  • anchoring

That's many biases reduced, just by thinking up a couple of reasons your initial judgment might be incorrect.

Though don't dwell on it too long:

After a certain point, it becomes increasingly difficult for a person to generate reasons they might have been incorrect. That difficulty then serves to convince them that their idea must be right; otherwise it would be easier to come up with reasons against the claim. At this point, the technique ceases to have a debiasing effect. While the exact number of reasons one should consider likely differs from case to case, Sanna et al. (2002) found a debiasing effect when subjects were asked to consider 2 reasons against their initial conclusion, but not when they were asked to consider 10. Consequently, it seems plausible that the ideal number of counterarguments is closer to 2 than to 10.

So: consider the opposite, but not too much.


Bias blind spot

Perhaps the scariest bias.

Potential causes

  • The introspection illusion: you believe that you have direct insight into the origins of your mental states (and ironically, it's plain to you that other people do not know why they do what they do).
    • It could help to have deeply internalized felt senses of
      • how easy it is for two people to misunderstand each other (test: if you don't tend to make the fundamental attribution error, good)
      • that you don't know where you ever get any thought from; that you're not a rational homunculus deep down merely beset by annoying biases, but rather that those biases are all there is behind your sense of self; that you're running on malicious hardware
      • Level 2 theory of mind
      • that it is useless to be superior – egolessness
    • It could help to fully train away the mind projection fallacy, the typical mind fallacy, the fundamental attribution error, and every sort of error that comes from extrapolating from how you work to how others work; here, the intent is the reverse: extrapolate from how others work to how you work.
  • Self-enhancement bias (underlying or same as Dunning-Kruger effect?)
    • Perhaps err on the side of impostor syndrome in this specific matter, so that you suspect your introspective ability to be closer to that of a lemur than to that of most people
      • I am worse at knowing myself than G is at knowing herself, for sure

Other potential patches:

  • For each important decision or conclusion, draw on your encyclopedic knowledge of biases and ask "what biases are going into this decision? what biases are involved in this type of decision?" Then correct for each at least somewhat; see the sketch after this list.
    • Correcting for them doesn't mean "oh, I see that I was biased in that way here"… all the research says you won't see it merely from having it pointed out as a possibility. It means you shift your conclusion anyway, despite feeling it's already as correct as can be.
    • The problem is that we don't know how much we're personally affected by each bias, especially after education in debiasing, and especially considering that some biases may not even exist (bad science), so how do we know how much to correct for them? Maybe only bother to do this in cases where you have a feedback system, i.e. you will soon be told if your conclusion is wrong. Need concrete examples.
      • Taleb could patch in here… not just whether there's a feedback system; maybe also consider whether the loss is bounded or unbounded. Need concrete examples.
  • Supposing that the blind spot cannot be removed, maybe we can make the spot smaller? Shrink the space of consequences. Limit the damage.
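
To make the first patch concrete, here is a toy sketch in Python. Everything in it (the biases chosen, the direction and size of each nudge, the name debias_estimate) is a hypothetical illustration, not a validated procedure; how large the corrections should be is exactly the open problem raised in the bullet about feedback systems.

```python
# A toy sketch of the "bias checklist" patch, applied to a numeric estimate.
# The bias list and the nudge sizes are made-up placeholders.

CHECKLIST = {
    # bias -> signed fractional nudge applied to the initial estimate
    "planning fallacy (time/cost estimates run low)": +0.30,
    "anchoring (first number seen pulls the estimate)": +0.10,
}

def debias_estimate(initial, applicable):
    """Shift the estimate once per applicable bias, even though it
    already *feels* as correct as can be (that feeling is the point)."""
    estimate = initial
    for bias in applicable:
        estimate *= 1 + CHECKLIST[bias]
        print(f"after correcting for {bias}: {estimate:.1f}")
    return estimate

# "This project will take 10 days" -> corrected upward to ~14 days.
debias_estimate(10.0, list(CHECKLIST))
```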


You don't need to know about biases to debias

Telling the subjects of cogsci experiments about a bias rarely helps them – they continue to be just as biased. YES we need Rationality techniques that produce less biased judgment, but NO we don't need to know which biases they circumvent, since the techniques should circumvent them whether or not we know of them.

If that's true, I see only two reasons to learn about biases: so you can re-explain them to people, to drive home how much the techniques may help; and, if you're developing a new technique, you'll of course need to have a bias in mind.

I understand the research has shown that for most biases, it doesn't help to be told about them or to keep them in mind. However, it seems feasible to me that it could help to dwell more deeply on each bias, at least if you first approach it as a good student would – come at it with many different sorts of questions, think about concrete examples, how it would show up, what sort of policy can prevent the problem, etc. – and then develop a TAP (trigger-action plan) and finally train yourself to Notice the trigger. Game, set and match.
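
To make the TAP shape concrete, here is a minimal sketch; the triggers and actions are my own illustrative examples (drawn from the notes above), not anything prescribed by the research:

```python
# A TAP (trigger-action plan) is just a concrete, noticeable trigger
# paired with a small prepared action. The pairing below is the whole
# structure; the training is in drilling yourself to Notice the trigger.

TAPS = [
    ("I feel certain and haven't listed any alternatives",
     "write down 2 reasons my judgment could be wrong"),
    ("an observation only *almost* fits my explanation",
     "ask: what would I have done to stage this result?"),
]

for trigger, action in TAPS:
    print(f"WHEN I notice: {trigger}")
    print(f"THEN: {action}\n")
```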

Since Knowing About Biases Can Hurt You, it seems worth doing at least this much for biases you do know about. If you're gonna know it, know it well.


How to be confused by fiction?

Re. the story of the hot/cool plate that was then rotated (www.greaterwrong.com/posts/fysgqk4CjAwhBgNYT/fake-explanations)

The fix is not to end your faith in your understanding of sequences of events forever… but at the least, you must be able to come up with out-of-the-box explanations for strange happenings, up to and including that everything you experienced was a hallucination. Actually, it's not about creativity itself (that part is easy), but about the habit of activating your creativity as soon as you sense you could find something that fits reality better than the fait accompli, even if you have to reach far into the improbable. Some disinclination to take what you seem to have observed as something that did happen.

Using a different example: if you never suspected that your science teacher moonlights as a magician, you'd naturally assign a low probability to the hypothesis that she uses sleight of hand to fool you about something. That's fine. The problem is when you don't even think of this hypothesis. You must spot at least the fact that sleight of hand would be one of the easiest ways to cause your current observations, when your physics model otherwise struggles to explain them at anywhere near gears-level.
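
As a toy illustration (the priors and likelihoods below are made-up numbers, not anything from the original story), the failure mode lives in the enumeration step, not in the probability assignment:

```python
# Bayes' rule over an explicit hypothesis space: P(h|obs) ∝ P(h) * P(obs|h).
# The point: a hypothesis absent from the space gets probability 0 by
# construction, no matter how well it would explain the observation.

def posterior(priors, likelihoods):
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())
    return {h: round(p / total, 3) for h, p in unnormalized.items()}

# Explaining "the plate is cold on the side nearest the fire":
priors = {
    "my physics model plus some fluke": 0.98,
    "sleight of hand / a staged scene": 0.02,   # a low prior is fine
}
likelihoods = {
    "my physics model plus some fluke": 0.001,  # the model barely allows it
    "sleight of hand / a staged scene": 0.5,    # trickery predicts it easily
}

print(posterior(priors, likelihoods))
# -> trickery ends up at ~0.91 despite its low prior. Delete that key from
#    both dicts and "fluke" wins with probability 1.0 by default.
```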

Inspired by Cicero's cui bono, one way to do it: Zoom out, assume someone meant for you to see this odd result, ask what you would have done to make it happen.

Or: faced with the challenge of explaining the chilly metal plate, ask what you would do if you decided right now to go into another room with another metal plate and another fire to reproduce the result. This looks like a variant of my trick for finding out how I messed up in a relationship: asking "what would I have done differently if I restarted the day?" (Cognitive reframings).

Or it's a variant of considering the counterfactual (Reversal test), although here I see no "traditional" counterfactual. The observation must be taken as given; it makes little sense to consider "what if I had seen a different observation?" (though that might be useful for bug-checking your proposed explanation). Instead, you consider an alternative world, one where a truth-fairy tells you that some of the facts in your hand are not facts: you've been fed a lie (deliberate or accidental), or someone misdirected your attention or omitted something, or you've misinterpreted something. Having been told this by the fairy, what would be your first thought?

Another way to frame this question: How to feel shocked enough? Practice that to stop the habit of accepting the fait accompli.
