History of statistics

See Eugenics

History of Bayesian methods

Reverend Thomas Bayes was inspired by David Hume's idea that knowledge can only be based on experience. In 1763 his paper was published posthumously, presenting the basic theorem for updating a preexisting guess in light of evidence.

Pierre-Simon Laplace picked the idea up, and eleven years later (in 1774) he presented the fundamental principles of Bayesian probability theory. He tested it through extensive work on demography. Seven years after that (in 1781) he gave Bayes' theorem its current form. Some say Bayesian statistics might more rightly be called Laplacian statistics.
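
In its modern form, with H a hypothesis and E the evidence, the theorem says the updated (posterior) belief is the prior belief reweighted by how well H predicts E:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$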

After Laplace, the concept of probability as a measure of personal uncertainty fell out of favor for a century (why?). Karl Pearson and Francis Ysidro Edgeworth revisited it briefly, but both names are associated mainly with frequentism.

Bruno de Finetti (1906–1985) dedicated his life to the idea of subjective probability. He opened his book on probability theory with the declaration "PROBABILITY DOES NOT EXIST". He made the case in a famous 1930 paper and again in 1974, where he wrote: "My thesis, paradoxically, and a little provocatively, but nonetheless genuinely, is simply this: Probability does not exist. The abandonment of superstitious beliefs about the existence of the Phlogiston, the Cosmic Ether, Absolute Space and Time, … or Fairies and Witches was an essential step along the road to scientific thinking. Probability, too, if regarded as something endowed with some kind of objective existence, is no less a misleading misconception, an illusory attempt to exteriorize or materialize our true probabilistic beliefs."

Independently of Bruno de Finetti, Frank Ramsey arrived at the same conclusions (1931). Harold Jeffreys focused on "objective probability" while maintaining a Bayesian framework (1939).

Many applied Bayesian methods were developed during the Second World War, notably by Alan Turing. Around the same period, Stanislaw Ulam conceived Monte Carlo simulation, and John von Neumann programmed the ENIAC (the first general-purpose electronic computer!) to run it.
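
Monte Carlo simulation just means estimating a quantity by averaging over random draws. A minimal sketch in Python, estimating π rather than the neutron-transport problems that actually ran on the ENIAC:

```python
import numpy as np

rng = np.random.default_rng(42)

# Sample points uniformly in the unit square; the fraction that falls
# inside the quarter circle x^2 + y^2 <= 1 converges to pi/4.
n = 1_000_000
x, y = rng.uniform(size=n), rng.uniform(size=n)
inside = x**2 + y**2 <= 1.0
print(f"pi is approximately {4 * inside.mean():.4f}")
```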

After the Second World War came a so-called neo-Bayesianism, a revival that emphasized decision theory: the use of probability theory to guide reasoning and action in general. Today it underlies the normative standard of rational behavior, with psychologists and cognitive scientists measuring irrationality as departures from Bayesian reasoning. A central figure here is Leonard Savage, who synthesized the work of Bruno de Finetti and Frank Ramsey in 1954.

Dennis Lindley was visiting the University of Chicago, where Leonard Savage worked, and came away inspired by meeting him. In a 1957 paper, Lindley discussed how the frequentist ("conventional") approach and the Bayesian approach can disagree about hypothesis testing, something now called Lindley's paradox (not really a paradox).
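
Here is a hedged numerical sketch of the disagreement (assuming normal data with known variance, a point null H0: theta = 0, and, under H1, a standard normal prior on theta; the numbers are illustrative, not from Lindley's paper). Hold the sample mean exactly on the two-sided 5% rejection boundary: the p-value then stays at 0.05 for every sample size, while the posterior probability of H0 climbs toward 1.

```python
import numpy as np
from scipy import stats

def lindley_demo(n, tau=1.0):
    # Keep the sample mean exactly on the two-sided 5% rejection boundary,
    # so the frequentist p-value is ~0.05 at every n.
    xbar = 1.96 / np.sqrt(n)
    p_value = 2 * (1 - stats.norm.cdf(np.sqrt(n) * xbar))

    # Bayesian side: marginal likelihood of xbar under each hypothesis,
    # with equal prior odds P(H0) = P(H1) = 1/2.
    m0 = stats.norm.pdf(xbar, loc=0, scale=np.sqrt(1 / n))           # H0: theta = 0
    m1 = stats.norm.pdf(xbar, loc=0, scale=np.sqrt(tau**2 + 1 / n))  # H1: theta ~ N(0, tau^2)
    return p_value, m0 / (m0 + m1)

for n in [10, 100, 10_000, 1_000_000]:
    p, post = lindley_demo(n)
    print(f"n={n:>9,}  p-value={p:.3f}  P(H0 | data)={post:.3f}")
```

In this toy setup the two approaches roughly agree at n = 10 (posterior probability of H0 around 0.37), but at n = 1,000,000 the same p-value of 0.05 coexists with P(H0 | data) near 0.99.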

In his 2006 book Understanding Uncertainty, he wrote: "Uncertainty is a personal matter; it is not the uncertainty but your uncertainty."

Because Bayesian methods were calculation-heavy and thus impractical, they were rarely used up until the 90s (though John Tukey proposed Bayesian methodology to predict the 1962 US elections).

From the 90s on, we have had two things going for us: powerful computers and relatively efficient algorithms such as Markov chain Monte Carlo (MCMC). It is now finally possible for any researcher to use Bayesian modeling.
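
To make this concrete, here is a minimal sketch of random-walk Metropolis, the simplest MCMC variant, run on a toy conjugate model (the model, prior, and tuning constants are illustrative assumptions, not anything prescribed above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: theta ~ N(0, 1) prior, observations y_i ~ N(theta, 1).
y = rng.normal(loc=1.5, scale=1.0, size=20)

def log_posterior(theta):
    log_prior = -0.5 * theta**2
    log_lik = -0.5 * np.sum((y - theta) ** 2)
    return log_prior + log_lik

def metropolis(n_samples, step=0.5):
    """Random-walk Metropolis: propose a local move, accept it with
    probability min(1, posterior ratio), otherwise stay put."""
    samples = np.empty(n_samples)
    theta = 0.0
    for i in range(n_samples):
        proposal = theta + rng.normal(scale=step)
        if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
            theta = proposal
        samples[i] = theta
    return samples

draws = metropolis(10_000)[1_000:]  # discard burn-in
print(f"posterior mean ~ {draws.mean():.2f}, sd ~ {draws.std():.2f}")
```

This toy model has a closed-form answer (posterior mean n·ȳ/(n + 1), sd 1/√(n + 1) ≈ 0.22 for n = 20), so the draws can be checked directly; MCMC earns its keep on models where no such closed form exists.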
