Estimator

#statistics

An estimator is a rule for estimating a given quantity based on observed data.
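A quick sketch of the idea in Python (the population mean of 5.0 is an assumption, just for illustration): the estimator is simply a function from observed data to a number.

```python
# Minimal sketch: an estimator is just a rule (function) applied to observed data.
# Here the sample mean estimates the unknown population mean.
import random

def sample_mean(observations):
    """Estimator: maps the observed data to an estimate of the population mean."""
    return sum(observations) / len(observations)

random.seed(0)
data = [random.gauss(5.0, 2.0) for _ in range(100)]  # true mean assumed to be 5.0
print(sample_mean(data))  # the estimate, computed only from the observed data
```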

Related concepts: error, MSE, sampling deviation, bias. Example estimators: best linear unbiased estimator (BLUE), minimum-variance unbiased estimator (MVUE), maximum a posteriori (MAP).

A consistent estimator, AKA asymptotically consistent estimator, is one whose estimates converge to the true value as the number of samples grows, even if it is biased for small samples.
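A rough sketch of consistency (assuming a normal population with true mean 5.0): the sample mean gets ever closer to the true value as n grows.

```python
# Sketch: consistency means the estimate converges to the true value as n grows.
import random

random.seed(0)
true_mean = 5.0

for n in (10, 1_000, 100_000):
    xs = [random.gauss(true_mean, 2.0) for _ in range(n)]
    estimate = sum(xs) / n  # sample mean: a consistent estimator of the mean
    print(n, estimate)
# The printed estimates cluster ever closer to 5.0 as n increases.
```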

A biased estimator is one whose expected value differs from the true value of the quantity being estimated.
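A sketch of bias, using the classic example (assumed here, not from the note above): the variance estimator that divides by n has expected value (n−1)/n · σ², so it is biased, while dividing by n−1 is unbiased. Averaging over many repeated samples approximates the expected value.

```python
# Sketch: the 1/n variance estimator is biased (expected value is (n-1)/n * sigma^2);
# the 1/(n-1) version is unbiased. Averaging over many trials approximates E[estimator].
import random

random.seed(0)
true_var, n, trials = 4.0, 5, 200_000

def var_biased(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)        # divides by n

def var_unbiased(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # divides by n - 1

samples = [[random.gauss(0.0, true_var ** 0.5) for _ in range(n)] for _ in range(trials)]
print(sum(map(var_biased, samples)) / trials)    # ≈ 3.2 = (n-1)/n * 4.0  (biased)
print(sum(map(var_unbiased, samples)) / trials)  # ≈ 4.0                  (unbiased)
```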

If an unbiased estimator converges on any value at all, it must be consistent. In other words, there is no such thing as an unbiased inconsistent estimator, but there are unbiased estimators that do not converge. The schoolbook example is a dead-stupid estimator like "just use the latest observation", i.e. estimate the mean by Xn alone, ignoring X1, X2, … Xn−1. Naturally it does not converge to any kind of average, but its expected value equals the true mean, so it is unbiased.
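A sketch of that example (again assuming a normal population with mean 5.0): averaging the "latest observation" estimator over many repeated experiments recovers the true mean (unbiased), yet within a single growing sample it never settles down (not consistent).

```python
# Sketch: the "latest observation" estimator X_n is unbiased for the mean
# but does not converge no matter how many samples are drawn.
import random

random.seed(0)
true_mean = 5.0

# Unbiasedness: the estimator averaged over many repeated experiments ≈ true mean.
estimates = [random.gauss(true_mean, 2.0) for _ in range(100_000)]  # one X_n per experiment
print(sum(estimates) / len(estimates))  # ≈ 5.0

# Inconsistency: within one growing sample, the latest observation keeps jumping
# around instead of settling near 5.0.
for n in (10, 1_000, 100_000):
    xs = [random.gauss(true_mean, 2.0) for _ in range(n)]
    print(n, xs[-1])  # the estimate is the latest observation only
```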

BLUE
