Akaike Information Criterion
The rationale behind information criteria is laid out well in Kullback's book Information Theory and Statistics.
True model: y_i = 1 + 0.1·x_i − 0.2·x_i² + …
Fit many candidate models (hundreds, even thousands): ∑ …
Choose the model with the best (smallest) AIC/BIC/DIC/WAIC.
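The workflow above can be sketched as follows. This is a minimal illustration, assuming Gaussian noise with σ = 1 and polynomial candidate models (neither is specified in the notes); the in-sample fit here is raw residual sum of squares, before any information-criterion penalty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate from the "true" model y = 1 + 0.1*x - 0.2*x^2.
# Gaussian noise with sigma = 1 is an assumption for illustration.
n = 100
x = rng.uniform(-2, 2, n)
y = 1 + 0.1 * x - 0.2 * x**2 + rng.normal(0, 1, n)

# Candidate models: polynomials of increasing degree; record the
# in-sample residual sum of squares for each.
rss = {}
for degree in range(1, 6):
    coefs = np.polyfit(x, y, degree)
    rss[degree] = float(np.sum((y - np.polyval(coefs, x)) ** 2))
```

Note that in-sample fit alone always improves with model complexity (nested least-squares fits can only lower the RSS), which is exactly why the penalized criteria below are needed.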
AIC = D_train + 2p, where D_train is the deviance on the training data and p is the number of free parameters.
AIC is an approximation that is reliable only when: (1) The priors are flat or overwhelmed by the likelihood. (2) The posterior distribution is approximately multivariate Gaussian. (3) The sample size N is much greater than the number of parameters p.
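A small sketch of computing AIC for a Gaussian model, assuming D_train is taken as −2 times the maximized log-likelihood and that the noise variance counts as one extra parameter (a common convention, not stated in the notes):

```python
import numpy as np

def gaussian_aic(y, yhat, n_coef):
    """AIC = D_train + 2p for a Gaussian model.

    D_train = -2 * (maximized log-likelihood); p counts the
    regression coefficients plus the noise variance (assumption).
    """
    n = len(y)
    sigma2 = np.mean((y - yhat) ** 2)   # MLE of the noise variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    p = n_coef + 1                      # +1 for sigma^2
    return -2 * loglik + 2 * p
```

One sanity check: with identical predictions, adding one parameter raises the AIC by exactly 2, which is the penalty term doing its job.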
Watanabe-Akaike Information Criterion
Like AIC, WAIC can be used to rank models. A more interpretable measure, however, is the Akaike weight. The weight for model i in a set of m models is given by
w_i = exp(−dWAIC_i / 2) / ∑_{j=1..m} exp(−dWAIC_j / 2)
where dWAIC_i is the difference between each model's WAIC and the lowest WAIC, i.e. dWAIC_i = WAIC_i − WAIC_min.
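The weight formula above is a straightforward softmax over −WAIC/2; a minimal sketch (the example WAIC values are made up):

```python
import numpy as np

def akaike_weights(waic):
    """Convert a list of WAIC scores into Akaike weights:
    w_i = exp(-dWAIC_i / 2) / sum_j exp(-dWAIC_j / 2)."""
    waic = np.asarray(waic, dtype=float)
    d = waic - waic.min()      # dWAIC relative to the best model
    rel = np.exp(-0.5 * d)     # relative likelihoods
    return rel / rel.sum()

weights = akaike_weights([100.0, 102.0, 110.0])
```

Subtracting the minimum before exponentiating also keeps the computation numerically stable for large WAIC values.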
Leave-one-out cross-validation (LOO-CV)
The new kid on the block: as of around 2020 it was considered the best option (for which situations, though?).
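A brute-force sketch of the LOO-CV idea: refit the model n times, each time holding out one observation and scoring the prediction on it. This toy version uses squared prediction error on polynomial fits as a stand-in; the Bayesian version scores the held-out log predictive density instead (and in practice is approximated, e.g. with PSIS, rather than refit n times).

```python
import numpy as np

def loo_cv_score(x, y, degree):
    """Leave-one-out CV for a polynomial model: refit n times, each
    time holding out one point, and average the squared prediction
    error on the held-out point (a stand-in for the log predictive
    density used in full Bayesian LOO)."""
    n = len(x)
    errs = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        coefs = np.polyfit(x[mask], y[mask], degree)
        errs[i] = (y[i] - np.polyval(coefs, x[i])) ** 2
    return float(errs.mean())
```

Unlike the in-sample deviance, this score does not automatically improve with model complexity, which is what makes it usable for model comparison.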