Sensitivity analysis

Bayesian methods

I've followed this approach in much of my own applied work, using noninformative priors and carefully avoiding the use of prior information in the final stages of a statistical analysis. But that can't always be the right choice. Sometimes (as in the sex ratio example above), the data are just too weak—and a classical textbook data analysis can be misleading. Imagine a Venn diagram, where one circle is "Topics that are so controversial that we want to avoid using prior information in the statistical analysis" and the other circle is "Problems where the data are weak compared to prior information." If you're in the intersection of these circles, you have to make some tough choices!

More generally, there is a Bayesian solution to the problem of sensitivity to prior assumptions. That solution is sensitivity analysis: perform several analyses using different reasonable priors. Make more explicit the mapping from prior and data to conclusions. Be open about sensitivity; don't try to sweep the problem under the rug. And, if you're going that route, I'd also like to see some analysis of sensitivity to assumptions that are not conventionally classified as "prior." You know, those assumptions that get thrown in because they're what everybody does. For example, Cox regression is great, but additivity is a prior assumption too! (One might argue that assumptions such as additivity, logistic links, etc., are exempt from Fisher's strictures by virtue of being default assumptions rather than being based on prior information—but I certainly don't think Mayo would take that position, given her strong feelings on Bayesian default priors.)

–Gelman
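
As a concrete illustration of the kind of prior sensitivity analysis described above, here is a minimal sketch for a binomial proportion in the spirit of the sex-ratio example: the same weak data are combined with several reasonable Beta priors, and the posterior is reported under each. The data values and prior parameters are hypothetical, chosen only to show the workflow, not taken from the original example.

```python
# A minimal sketch of prior sensitivity analysis for a binomial proportion.
# The data (y out of n) and the set of priors are made-up illustrations.
from scipy import stats

# Hypothetical weak data: y girl births out of n
y, n = 25, 50

# Several "reasonable" Beta priors on the proportion of girls:
#   - flat (noninformative)
#   - weakly informative, centered near 0.485
#   - strongly informative, reflecting known sex-ratio stability
priors = {
    "flat Beta(1, 1)":         (1, 1),
    "weak Beta(48.5, 51.5)":   (48.5, 51.5),
    "strong Beta(4850, 5150)": (4850, 5150),
}

for label, (a, b) in priors.items():
    # Beta prior + binomial likelihood -> Beta posterior (conjugacy)
    post = stats.beta(a + y, b + (n - y))
    lo, hi = post.ppf([0.025, 0.975])
    print(f"{label:>24}: posterior mean {post.mean():.3f}, "
          f"95% interval ({lo:.3f}, {hi:.3f})")
```

With data this weak, the three priors give noticeably different posterior intervals, which is exactly the dependence on prior assumptions that the sensitivity analysis is meant to make explicit rather than hide.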
