
Bias blind spot

Perhaps the scariest bias.

Potential causes

  • The introspection illusion: you believe that you have direct insight into the origins of your mental states (and ironically, it's plain to you that other people do not know why they do what they do).
    • It could help to have deeply internalized felt senses of
      • how easy it is for two people to misunderstand each other (test: if you don't tend to make the fundamental attribution error, good)
      • that you don't know where you ever get any thought from – that you're not a rational homunculus deep down merely being beset by annoying biases, but rather that those biases are all there is behind your sense of self – that you're running on malicious hardware
      • Level 2 theory of mind
      • that it is useless to be superior – egolessness
    • It could help to fully train away the mind projection fallacy, the typical mind fallacy, the fundamental attribution error, and the whole class of errors that come from extrapolating from how you work to how others work – here the intent is the reverse: to extrapolate from how others work to how you work.
  • Self-enhancement bias (underlying or same as Dunning-Kruger effect?)
      • Perhaps err on the side of an impostor effect in this specific matter, so you suspect your introspective ability to be closer to a lemur's than to most people's
      • I am worse at knowing myself than G is at knowing herself, for sure

Other potential patches:

  • For each important decision or conclusion, draw on your encyclopedic knowledge of biases and "what biases are going into this decision?/what biases are involved in this type of decision?" Then correct for each at least somewhat.
    • Correcting for them doesn't mean "oh, I see that I was biased in that way here"… all the research says you won't see it just from having it pointed out as a possibility. It means you shift your conclusion anyway, despite feeling it's already as correct as can be.
      • The problem is that we don't know how much we're personally affected by each bias, especially after education in debiasing, and especially considering some biases may not even exist (bad science), so how do we know how much to correct for them? Maybe only bother to do this in cases where you do have a feedback system, i.e. you will soon be told if your conclusion is wrong. Need concrete examples (one toy sketch of such a feedback system follows below this list).
      • Taleb could patch in here … not just whether there's a feedback system; maybe also consider whether the loss is bounded or unbounded. Need concrete examples.
  • Supposing that the blind spot cannot be removed, maybe we can make the spot smaller? Shrink the space of consequences. Limit the damage.
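
A minimal sketch of what such a feedback system could look like – purely illustrative Python, with made-up file and field names, not a prescription: log the conclusion, the biases you suspect, and your felt confidence both before and after shifting it; once you learn how things turned out, the track record shows whether the correction actually helped.

    import json
    from pathlib import Path

    JOURNAL = Path("decision_journal.jsonl")   # made-up file name

    def log_decision(conclusion, suspected_biases, p_before, p_after):
        """Record a decision at the moment you make it."""
        entry = {
            "conclusion": conclusion,
            "suspected_biases": suspected_biases,  # e.g. ["planning fallacy"]
            "p_before": p_before,  # felt confidence before correcting (0-1)
            "p_after": p_after,    # confidence after shifting it anyway (0-1)
            "outcome": None,       # later: 1 if the conclusion held up, else 0
        }
        with JOURNAL.open("a") as f:
            f.write(json.dumps(entry) + "\n")

    def score():
        """Once outcomes are filled in, see which confidence tracked reality better."""
        if not JOURNAL.exists():
            print("no decisions logged yet")
            return
        entries = [json.loads(line) for line in JOURNAL.read_text().splitlines() if line]
        scored = [e for e in entries if e["outcome"] is not None]
        if not scored:
            print("no decisions with feedback yet")
            return
        for label in ("p_before", "p_after"):
            # Brier score: lower means that confidence was better calibrated.
            brier = sum((e[label] - e["outcome"]) ** 2 for e in scored) / len(scored)
            print(f"{label}: Brier {brier:.3f} over {len(scored)} decisions")

The script is not the point; the loop is: you only learn how much to correct for a bias by checking, later, whether the corrected or the uncorrected confidence was closer to reality.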

You don't need to know about biases to debias

Telling the subjects of cogsci experiments about a bias rarely helps them – they continue to be just as biased. YES we need Rationality techniques that produce less biased judgment, but NO we don't need to know which biases they circumvent, since the techniques should circumvent them whether or not we know of them.

If that's true, I see only two reasons to learn about biases: so you can re-explain them to people, to drive home how much the techniques may help; and, if you're developing a new technique, you'll of course need to have a bias in mind.

I understand the research has shown that for most biases, it doesn't help to be told about them or to keep them in mind. However, it seems feasible to me that it could help to dwell more deeply on each bias, at least if you first do as a good student would – approach it with many different sorts of questions, think about concrete examples, how it would show up, what sort of policy can prevent the problem, etc – and then develop a TAP (trigger-action plan) and finally train yourself to Notice the trigger. Game, set and match.

Since Knowing About Biases Can Hurt You, it seems worth doing at least this much for biases you do know about. If you're gonna know it, know it well.

How to be confused by fiction?

Re. the story of the hot/cool plate that was then rotated (www.greaterwrong.com/posts/fysgqk4CjAwhBgNYT/fake-explanations)

The fix is not to end your faith in your understanding of sequences of events forever… but at the least you must be able to come up with out-of-the-box explanations for strange happenings, up to and including that everything you experienced was a hallucination. Actually, it's not about creativity itself (that part is easy), but about the habit of activating your creativity as soon as you sense that you can find something that fits reality better than the fait accompli, even if you have to reach far into the improbable. Some disinclination to take what you seem to have observed as something that definitely did happen.

Using a different example: if you never suspected that your science teacher moonlights as a magician, you'd naturally assign a low probability to the hypothesis that she uses sleight of hand to fool you about something. That's fine. The problem is when you don't even think of this hypothesis. You must at least spot the fact that sleight of hand would be one of the easiest ways to cause your current observations, when your physics model otherwise struggles to explain them at anywhere near gears-level.

Inspired by Cicero's cui bono, one way to do it: Zoom out, assume someone meant for you to see this odd result, ask what you would have done to make it happen.

Or: faced with the challenge of explaining the chilly metal plate, ask what you would do if you decided right now to go into another room with another metal plate and another fire to reproduce the result. This looks like a variant of my trick for finding out how I messed up in a relationship: asking "what would I have done differently if I re-started the day?" (Cognitive reframings).

Or it's a variant of considering the counterfactual (Reversal test), although here I see no "traditional" counterfactual – the observation must be taken as given, and it makes little sense to consider "what if I had seen a different observation?" (though that might be useful for bug-checking your proposed explanation). Instead, the alternative world you consider is one where a truth-fairy tells you that some of the facts in your hand are not facts: you've been fed a lie (or an accidental lie), or someone misdirected your attention or omitted something, or you've misinterpreted something. Having been told this by the fairy, what would be your first thought?

Another way to frame this question: How to feel shocked enough? Practice that to stop the habit of accepting the fait accompli.

How to feel shocked enough?

Hindsight devalues science – but if it were just that we don't feel as "grateful" to science as we could, that would be a minor problem. Hindsight bias prevents us from noticing that a new scientific finding DOES NOT fit what we would have expected, and so we cannot debug why our world-model would have guessed differently.

This debugging is desirable, but pause here for a digression. Assume now that our priors were actually at 1:1 or better in favour of observing the finding. Even then (or maybe especially then), we're never as shocked as we could be (hindsight devalues science: "of course students perform better after a morning walk, all these researchers proving what we already knew, what a waste of dollars…"). Being simply told a fact, we need to put in conscious effort to be shocked enough.

It would be better if everyone always led with a question to let you guess first which way the fact will go, and only then gave you the fact (Ask people to guess before telling them a fact).

My idea to feel shocked enough: REVERSE ALL NEW FACTS.

(Looks like yet another type of Reversal test!)

It goes something like this… you reverse the fact and ask whether you would've bought the reversed version with the same credulity, or whether it would've been surprising. If the reversal would've been surprising, that's good – your model really did favour the fact you were told.

Sometimes there is no direct reverse, because there were many possible answers. Then you go back and ask what you would've guessed. (Maybe that's the proper name of the technique – not reversing the fact, but "what would you have guessed?")

It's really hard to know what you would've guessed, and that's why it's so much better to make a guess before hearing the fact.
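
To make that concrete, a toy drill – purely illustrative Python; the single example fact reuses the hypothetical morning-walk finding from above. You commit to a guess and a confidence, only then see the stated fact, and note whether you were surprised.

    facts = [
        # (question, the fact as you were about to hear it)
        ("Do students perform better after a morning walk?",
         "yes – at least, that is what you were about to be told"),
    ]

    log = []
    for question, stated_fact in facts:
        guess = input(f"{question}\nYour guess: ")
        confidence = float(input("How sure are you (0-1)? "))
        print(f"Stated fact: {stated_fact}")
        surprised = input("Surprised? (y/n) ").strip().lower() == "y"
        log.append({"question": question, "guess": guess,
                    "confidence": confidence, "surprised": surprised})

    # Rarely surprised and usually right is fine; rarely surprised but often
    # wrong is the hindsight-bias signature this note is about.
    print(f"Surprised on {sum(e['surprised'] for e in log)} of {len(log)} facts")

The drill itself matters less than the order: guess first, hear the fact second.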

Apply the technique to new science "findings" as a training step only – findings are easy to be skeptical of, thus not important. Where you most need your art as a rationalist is where it's most difficult. The goal is to reverse things that never occur to you as targets of doubt.

It takes effort to disbelieve, so it's probably good to have the habit of trying to disprove what you hear, and that starts with checking what you would've guessed. At the least, when people tell you something new, you can ask "is that so?" to promote to consciousness the fact that you just heard and therefore believed something. Similar to the TAP action "I notice that I'm confused", but here the function is simply "I notice that I just received unverified data".

For bonus points, say "Is that so? I wouldn't have guessed that" or "Is that so? I would've guessed X instead."

I'd like to run this mental operation often enough that it becomes effortless. It actually creates interesting daydreams too: when a coworker says his car broke down, you stop and wonder "huh, I wonder if I could've expected that" and start looking back for signs. Productive daydreams that refine your world-model.
