# Log-posterior

What's a log-posterior (the `lp__` output from Stan)? It's a quantity that every parameter in the model contributes to, and the higher it is, the more plausible the current parameter values are given the data and the priors.
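To make that concrete, here is a minimal sketch of what a log-posterior looks like, for a hypothetical model (data from a Normal with unknown mean, and a made-up Normal prior on that mean). The model, the prior, and the data are all assumptions for illustration, not anything from a specific Stan program:

```python
import math

def normal_logpdf(x, mu, sigma):
    # Log density of Normal(mu, sigma) evaluated at x
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def log_posterior(mu, data):
    # Log prior: mu ~ Normal(0, 10)  (hypothetical prior)
    lp = normal_logpdf(mu, 0.0, 10.0)
    # Log likelihood: each observation ~ Normal(mu, 1), added in turn
    for y in data:
        lp += normal_logpdf(y, mu, 1.0)
    return lp

data = [1.2, 0.8, 1.5, 0.9]
# A mu near the sample mean gets a higher log-posterior than a distant one
print(log_posterior(1.1, data) > log_posterior(5.0, data))  # True
```

Note that Stan's `lp__` is this quantity only up to an additive constant, since normalizing constants that don't depend on the parameters can be dropped.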

Break it down. First, it's closely related to the frequentist log-likelihood (the log of the likelihood function), and the log transformation is mostly a matter of convenience:

The log-likelihood tells you nothing you can't get from the likelihood itself, but if the observations are independent, it is additive: the log turns a product of densities into a sum. That's often an advantage when you want to differentiate to find a maximum. The posterior is logged for the same reason: it is derived partly from the likelihood which, as just mentioned, is additive.
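The additivity can be checked numerically. This sketch (with made-up data and parameter values) computes the likelihood of independent Normal observations as a product of densities, then as a sum of log densities, and confirms the two agree:

```python
import math

def normal_pdf(x, mu, sigma):
    # Density of Normal(mu, sigma) at x
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / math.sqrt(2 * math.pi * sigma**2)

data = [0.3, -1.1, 0.7, 2.0]
mu, sigma = 0.5, 1.0

# Likelihood of independent observations: a product of densities
likelihood = 1.0
for y in data:
    likelihood *= normal_pdf(y, mu, sigma)

# Log-likelihood: a sum of log densities
log_likelihood = sum(math.log(normal_pdf(y, mu, sigma)) for y in data)

print(abs(math.log(likelihood) - log_likelihood) < 1e-12)  # True
```

With many observations the product underflows to zero in floating point, which is a second practical reason the sum of logs is preferred.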

When comparing two models in a Bayesian setup, one can take the ratio of their posteriors, which can be interpreted as the odds in favour of one model over the other. Taking the log of that ratio gives the difference of the log-posteriors. Thus, the log-posterior can be used in model comparison.
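In log space the ratio becomes a subtraction. A tiny sketch, where the two log-posterior values are invented purely for illustration:

```python
import math

# Hypothetical log-posterior values for two models on the same data
lp_model_a = -152.3
lp_model_b = -154.6

# Log posterior odds is the difference; the odds is its exponential
log_odds = lp_model_a - lp_model_b
odds = math.exp(log_odds)  # equals exp(2.3), roughly 10-to-1 in favour of model A
print(odds)
```

Working with the difference also avoids ever exponentiating the individual log-posteriors, which would underflow for values this negative.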

Compared with frequentist methods, this is not unlike the likelihood ratio test for model comparison. The advantages in the Bayesian setting are twofold: 1) the models do not have to be nested, as they do in the likelihood ratio test; 2) the distribution of the likelihood ratio statistic is only known asymptotically, whereas the difference in log-posteriors gives us a distribution for whatever sample size we actually have.