Estimator
From Wikipedia, the free encyclopedia
In statistics, an estimator is a function of the observed sample data that is used to estimate an unknown population parameter; an estimate is the result of applying this function to a particular data set. For any given parameter there can be many different estimators, and some selection criterion is used to choose among them, although it is sometimes difficult to say that one estimator is better than another.
To estimate a parameter of interest (e.g., a population mean, a binomial proportion, a difference between two population means, or a ratio of two population standard deviations), the usual procedure is as follows:
1- Select a random sample from the population of interest.
2- Calculate the point estimate of the parameter.
3- Associate with this estimate a measure of its variability, often a confidence interval.
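The steps above can be sketched in code. This is a hypothetical illustration (the population, its parameters, and the normal-approximation interval are assumptions, not part of the original article), estimating a population mean from a simulated sample:

```python
import math
import random
import statistics

random.seed(0)

# Step 1: select a random sample from the population of interest
# (here an assumed Normal(50, 10) population).
sample = [random.gauss(50, 10) for _ in range(100)]

# Step 2: the point estimate of the population mean is the sample mean.
point_estimate = statistics.mean(sample)

# Step 3: associate a measure of variability -- here a rough 95%
# normal-approximation confidence interval via the standard error.
se = statistics.stdev(sample) / math.sqrt(len(sample))
ci = (point_estimate - 1.96 * se, point_estimate + 1.96 * se)
```

The same recipe applies to other parameters; only the point estimator and the formula for its standard error change.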
There are two types of estimators: point estimators and interval estimators.
Point estimators
A point estimator $\widehat{\theta}$ of a parameter $\theta$ has the following properties:
- The error of $\widehat{\theta}$ is $e(x) = \widehat{\theta}(x) - \theta$.
- The bias of $\widehat{\theta}$ is defined as $B(\widehat{\theta}) = \operatorname{E}(\widehat{\theta}) - \theta$.
- $\widehat{\theta}$ is an unbiased estimator of $\theta$ iff $\operatorname{E}(\widehat{\theta}) = \theta$ for all $\theta$, or, equivalently, iff $B(\widehat{\theta}) = 0$ for all $\theta$.
- The mean squared error of $\widehat{\theta}$ is defined as $\operatorname{MSE}(\widehat{\theta}) = \operatorname{E}\left[(\widehat{\theta} - \theta)^2\right]$.
- It satisfies $\operatorname{MSE}(\widehat{\theta}) = \operatorname{var}(\widehat{\theta}) + \left(B(\widehat{\theta})\right)^2$, i.e. mean squared error = variance + square of bias,
where $\operatorname{var}(X)$ is the variance of $X$ and $\operatorname{E}(X)$ is the expected value of $X$.
The standard deviation of an estimator $\widehat{\theta}$ of $\theta$ (the square root of its variance), or an estimate of that standard deviation, is called the standard error of $\widehat{\theta}$.
Consistency
A consistent estimator is an estimator that converges in probability to the quantity being estimated as the sample size grows.
An estimator $t_n$ (where $n$ is the sample size) is a consistent estimator for parameter $\theta$ if and only if, for all $\varepsilon > 0$, no matter how small, we have
$$\lim_{n\to\infty} \Pr\left\{ \left| t_n - \theta \right| < \varepsilon \right\} = 1.$$
It is called strongly consistent, if it converges almost surely to the true value.
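The defining limit can be observed numerically. In this hypothetical sketch (the Normal(0, 1) population, the sample mean as $t_n$, and the tolerance $\varepsilon = 0.1$ are all assumptions), the Monte Carlo estimate of $\Pr\{|t_n - \theta| < \varepsilon\}$ rises toward 1 as $n$ grows:

```python
import random
import statistics

random.seed(2)
theta = 0.0   # true mean of an assumed Normal(0, 1) population
eps = 0.1     # tolerance epsilon in the consistency definition
reps = 2000   # Monte Carlo repetitions per sample size

def coverage(n):
    """Monte Carlo estimate of P(|sample mean - theta| < eps)."""
    hits = sum(
        abs(statistics.fmean(random.gauss(theta, 1) for _ in range(n)) - theta) < eps
        for _ in range(reps)
    )
    return hits / reps

# The probability increases with the sample size n.
probs = [coverage(n) for n in (10, 100, 1000)]
```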
Efficiency
The quality of an estimator is generally judged by its mean squared error.
However, occasionally one chooses the unbiased estimator with the lowest variance. Efficient estimators are those that have the lowest possible variance among all unbiased estimators. In some cases, a biased estimator may have a uniformly smaller mean squared error than does any unbiased estimator, so one should not make too much of this concept. For that and other reasons, it is sometimes preferable not to limit oneself to unbiased estimators; see bias (statistics). Concerning such "best unbiased estimators", see also Cramér-Rao inequality, Gauss-Markov theorem, Lehmann-Scheffé theorem, Rao-Blackwell theorem.
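A classical instance of a biased estimator beating an unbiased one in mean squared error is variance estimation: for normal data, dividing the sum of squared deviations by $n$ (biased) gives a smaller MSE than dividing by $n-1$ (unbiased). A Monte Carlo sketch (the Normal(0, 2) population and sample size are assumptions):

```python
import random
import statistics

random.seed(3)
sigma2 = 4.0        # true variance of an assumed Normal(0, 2) population
n, reps = 10, 20000

mse_unbiased = 0.0
mse_biased = 0.0
for _ in range(reps):
    x = [random.gauss(0, 2) for _ in range(n)]
    m = statistics.fmean(x)
    ss = sum((xi - m) ** 2 for xi in x)
    mse_unbiased += (ss / (n - 1) - sigma2) ** 2  # unbiased estimator (divisor n-1)
    mse_biased += (ss / n - sigma2) ** 2          # biased estimator (divisor n)

mse_unbiased /= reps
mse_biased /= reps
# For normal data the biased divisor-n estimator has the smaller MSE,
# illustrating why "lowest variance among unbiased estimators" is not
# the whole story.
```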
Other properties
Often, estimators are subject to constraints (restricted estimators).
See also
- Maximum likelihood estimation
- Method of moments, generalized method of moments
- Cramér-Rao inequality
- Minimum mean squared error (MMSE)
- Maximum a posteriori (MAP)
- Minimum variance unbiased estimator (MVUE)
- Best linear unbiased estimator (BLUE)
- Unbiased estimators: see bias (statistics).
- Particle filter
- Markov chain Monte Carlo (MCMC)
- Kalman filter
- Wiener filter