James–Stein estimator

  1. Setting

  2. The James–Stein estimator

  3. Interpretation

  4. Improvements

  5. Extensions

  6. See also

  7. References

  8. Further reading

{{Short description|Biased estimator for Gaussian random vectors, better than ordinary least-squared-error minimization}}{{technical|date=November 2017}}

The James–Stein estimator is a biased estimator of the mean of Gaussian random vectors. When three or more means are estimated simultaneously, the James–Stein estimator dominates the "ordinary" least squares approach, i.e., it achieves lower mean squared error for every value of the unknown mean. It is the best-known example of Stein's phenomenon.

An earlier version of the estimator was developed by Charles Stein in 1956,[1] and is sometimes referred to as Stein's estimator.{{Citation needed|date=October 2010}} The result was improved by Willard James and Charles Stein in 1961.[2]

Setting

Suppose $\boldsymbol\theta$ is the unknown mean of an $m$-variate normally distributed random variable $\mathbf Y$ (with known covariance matrix $\sigma^2 I_m$):

$$\mathbf Y \sim N_m(\boldsymbol\theta, \sigma^2 I_m).$$

We are interested in obtaining an estimate, $\widehat{\boldsymbol\theta}$, of $\boldsymbol\theta$, based on a single observation, $\mathbf y$, of $\mathbf Y$.

This is an everyday situation in which a set of parameters is measured, and the measurements are corrupted by independent Gaussian noise. Since the noise has zero mean, it is very reasonable to use the measurements themselves as an estimate of the parameters. This is the approach of the least squares estimator, which is $\widehat{\boldsymbol\theta}_{LS} = \mathbf y$.

As a result, there was considerable shock and disbelief when Stein demonstrated that, in terms of mean squared error $\operatorname{E}\!\left[\|\boldsymbol\theta - \widehat{\boldsymbol\theta}\|^2\right]$, this approach is suboptimal.[1] The result became known as Stein's phenomenon.

The James–Stein estimator

If $\sigma^2$ is known, the James–Stein estimator is given by

$$\widehat{\boldsymbol\theta}_{JS} = \left(1 - \frac{(m-2)\sigma^2}{\|\mathbf y\|^2}\right)\mathbf y.$$

James and Stein showed that the above estimator dominates $\widehat{\boldsymbol\theta}_{LS}$ for any $m \ge 3$, meaning that the James–Stein estimator always achieves lower mean squared error (MSE) than the maximum likelihood estimator $\widehat{\boldsymbol\theta}_{LS} = \mathbf y$.[2][3] By definition, this makes the least squares estimator inadmissible when $m \ge 3$.
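
A minimal Monte Carlo sketch of this dominance result (not from the original article; the dimension, seed, and variable names are illustrative choices):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

m = 10                      # dimension; dominance requires m >= 3
sigma2 = 1.0                # known noise variance
theta = rng.normal(size=m)  # an arbitrary "true" mean vector
trials = 100_000

# Draw many independent observations y ~ N(theta, sigma2 * I).
y = theta + np.sqrt(sigma2) * rng.standard_normal((trials, m))

# Least squares / maximum likelihood estimate: the observation itself.
mse_ls = np.mean(np.sum((y - theta) ** 2, axis=1))

# James-Stein estimate: shrink each observation toward the origin.
shrink = 1.0 - (m - 2) * sigma2 / np.sum(y ** 2, axis=1, keepdims=True)
mse_js = np.mean(np.sum((shrink * y - theta) ** 2, axis=1))

print(f"LS MSE ~ {mse_ls:.3f} (theory: {m * sigma2})")
print(f"JS MSE ~ {mse_js:.3f} (strictly smaller for m >= 3)")
</syntaxhighlight>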

Notice that if $(m-2)\sigma^2 < \|\mathbf y\|^2$, then this estimator simply takes the natural estimator $\mathbf y$ and shrinks it towards the origin 0. In fact this is not the only direction of shrinkage that works. Let $\boldsymbol\nu$ be an arbitrary fixed vector of length $m$. Then there exists an estimator of the James–Stein type that shrinks toward $\boldsymbol\nu$, namely

$$\widehat{\boldsymbol\theta}_{JS} = \left(1 - \frac{(m-2)\sigma^2}{\|\mathbf y - \boldsymbol\nu\|^2}\right)(\mathbf y - \boldsymbol\nu) + \boldsymbol\nu, \qquad m \ge 3.$$

The James–Stein estimator dominates the usual estimator for any $\boldsymbol\nu$. A natural question to ask is whether the improvement over the usual estimator is independent of the choice of $\boldsymbol\nu$. The answer is no. The improvement is small if $\|\boldsymbol\theta - \boldsymbol\nu\|$ is large. Thus to get a very great improvement some knowledge of the location of $\boldsymbol\theta$ is necessary. Of course this is the quantity we are trying to estimate, so we do not have this knowledge a priori. But we may have some guess as to what the mean vector is. This can be considered a disadvantage of the estimator: the choice is not objective, as it may depend on the beliefs of the researcher.
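
A minimal sketch of this shifted estimator (an illustrative helper function, not from the article):

<syntaxhighlight lang="python">
import numpy as np

def james_stein_toward(y, nu, sigma2):
    """James-Stein estimate shrinking the observation y toward the
    fixed vector nu; requires len(y) >= 3 and known variance sigma2."""
    y, nu = np.asarray(y, float), np.asarray(nu, float)
    m, d = y.size, y - nu
    factor = 1.0 - (m - 2) * sigma2 / np.dot(d, d)
    return nu + factor * d

# Shrinking toward nu = 0 recovers the basic James-Stein estimator.
</syntaxhighlight>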

Interpretation

Seeing the James–Stein estimator as an empirical Bayes method gives some intuition to this result: one assumes that $\boldsymbol\theta$ itself is a random variable with prior distribution $\boldsymbol\theta \sim N(0, A I)$, where the prior variance $A$ is estimated from the data itself. Estimating $A$ only gives an advantage compared to the maximum-likelihood estimator when the dimension $m$ is large enough; hence it does not work for $m \le 2$. The James–Stein estimator is a member of a class of Bayesian estimators that dominate the maximum-likelihood estimator.[4]
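
In a little more detail (a standard sketch of the Efron–Morris argument, using the notation above): under the prior $\boldsymbol\theta \sim N(0, A I)$, the Bayes estimate of $\boldsymbol\theta$ is the posterior mean

$$\operatorname{E}[\boldsymbol\theta \mid \mathbf y] = \frac{A}{A+\sigma^2}\,\mathbf y = \left(1 - \frac{\sigma^2}{A+\sigma^2}\right)\mathbf y.$$

Marginally $\mathbf y \sim N(0, (A+\sigma^2) I)$, so $\|\mathbf y\|^2/(A+\sigma^2) \sim \chi^2_m$, and since $\operatorname{E}[1/\chi^2_m] = 1/(m-2)$ for $m \ge 3$,

$$\operatorname{E}\!\left[\frac{(m-2)\,\sigma^2}{\|\mathbf y\|^2}\right] = \frac{\sigma^2}{A+\sigma^2}.$$

Substituting this unbiased estimate of the shrinkage factor into the Bayes rule recovers the James–Stein estimator without knowledge of $A$; the requirement $m \ge 3$ enters exactly where the expectation of $1/\chi^2_m$ must exist.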

A consequence of the above discussion is the following counterintuitive result: When three or more unrelated parameters are measured, their total MSE can be reduced by using a combined estimator such as the James–Stein estimator; whereas when each parameter is estimated separately, the least squares (LS) estimator is admissible. A quirky example would be estimating the speed of light, tea consumption in Taiwan, and hog weight in Montana, all together. The James–Stein estimator always improves upon the total MSE, i.e., the sum of the expected errors of each component. Therefore, the total MSE in measuring light speed, tea consumption, and hog weight would improve by using the James–Stein estimator. However, any particular component (such as the speed of light) would improve for some parameter values, and deteriorate for others. Thus, although the James–Stein estimator dominates the LS estimator when three or more parameters are estimated, any single component does not dominate the respective component of the LS estimator.
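
An illustrative simulation of this trade-off (not from the article; the outlying value 5.0 and the other settings are arbitrary choices): with one parameter far from the shrinkage target, the total MSE still improves, while the MSE of the outlying component alone becomes worse than its LS value $\sigma^2$.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
m, sigma2, trials = 10, 1.0, 200_000

theta = np.zeros(m)
theta[0] = 5.0  # one "outlying" parameter, far from the origin

y = theta + rng.standard_normal((trials, m))
shrink = 1.0 - (m - 2) * sigma2 / np.sum(y ** 2, axis=1, keepdims=True)
err2 = (shrink * y - theta) ** 2          # squared error per component
per_component = err2.mean(axis=0)

print("JS total MSE:", per_component.sum(), " vs LS total:", m * sigma2)
print("JS MSE of component 0:", per_component[0], " vs LS:", sigma2)
</syntaxhighlight>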

The conclusion from this hypothetical example is that measurements should be combined if one is interested in minimizing their total MSE. For example, in a telecommunication setting, it is reasonable to combine channel tap measurements in a channel estimation scenario, as the goal is to minimize the total channel estimation error. Conversely, there could be objections to combining channel estimates of different users, since no user would want their channel estimate to deteriorate in order to improve the average network performance.{{Citation needed|date=February 2013}}

The James–Stein estimator has also found use in fundamental quantum theory, where the estimator has been used to improve the theoretical bounds of the entropic uncertainty principle (a recent development of the Heisenberg uncertainty principle) for more than three measurements.[5]

Improvements

The basic James–Stein estimator has the peculiar property that for small values of $\|\mathbf y\|$, the multiplier on $\mathbf y$ is actually negative. This can be easily remedied by replacing this multiplier by zero when it is negative. The resulting estimator is called the positive-part James–Stein estimator and is given by

$$\widehat{\boldsymbol\theta}_{JS+} = \left(1 - \frac{(m-2)\sigma^2}{\|\mathbf y\|^2}\right)^{\!+} \mathbf y,$$

where $(\cdot)^+$ denotes the positive part of its argument.

This estimator has a smaller risk than the basic James–Stein estimator. It follows that the basic James–Stein estimator is itself inadmissible.[6]

It turns out, however, that the positive-part estimator is also inadmissible.[3] This follows from a general result which requires admissible estimators to be smooth.
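
A minimal sketch of the positive-part rule (an illustrative helper, not from the article):

<syntaxhighlight lang="python">
import numpy as np

def james_stein_positive_part(y, sigma2):
    """Positive-part James-Stein estimate: the shrinkage multiplier is
    clipped at zero rather than being allowed to turn negative."""
    y = np.asarray(y, float)
    m = y.size  # requires m >= 3
    factor = max(0.0, 1.0 - (m - 2) * sigma2 / np.dot(y, y))
    return factor * y
</syntaxhighlight>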

Extensions

The James–Stein estimator may seem at first sight to be a result of some peculiarity of the problem setting. In fact, the estimator exemplifies a very wide-ranging effect, namely, the fact that the "ordinary" or least squares estimator is often inadmissible for simultaneous estimation of several parameters.{{Citation needed|date=February 2012}} This effect has been called Stein's phenomenon, and has been demonstrated for several different problem settings, some of which are briefly outlined below.

  • James and Stein demonstrated that the estimator presented above can still be used when the variance $\sigma^2$ is unknown, by replacing it with the standard estimator of the variance, $\widehat{\sigma}^2 = \frac{1}{n}\sum (y_i - \overline{y})^2$. The dominance result still holds under the same condition, namely, $m > 2$.[2]
  • The results in this article are for the case when only a single observation vector y is available. For the more general case when $n$ vectors are available, the results are similar (a code sketch of this variant follows the list):{{Citation needed|date=February 2012}}

$$\widehat{\boldsymbol\theta}_{JS} = \left(1 - \frac{(m-2)\frac{\sigma^2}{n}}{\|\overline{\mathbf y}\|^2}\right)\overline{\mathbf y},$$

where $\overline{\mathbf y}$ is the $m$-length average of the $n$ observations.

  • The work of James and Stein has been extended to the case of a general measurement covariance matrix, i.e., where measurements may be statistically dependent and may have differing variances.[7] A similar dominating estimator can be constructed, with a suitably generalized dominance condition. This can be used to construct a linear regression technique which outperforms the standard application of the LS estimator.[7]
  • Stein's result has been extended to a wide class of distributions and loss functions. However, this theory provides only an existence result, in that explicit dominating estimators were not actually exhibited.[8] It is quite difficult to obtain explicit estimators improving upon the usual estimator without specific restrictions on the underlying distributions.[3]
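
A minimal sketch of the multiple-observation variant mentioned in the list above (illustrative names; $\sigma^2$ assumed known):

<syntaxhighlight lang="python">
import numpy as np

def james_stein_average(Y, sigma2):
    """James-Stein estimate from n i.i.d. observation vectors.

    Y has shape (n, m) with m >= 3; the average has noise variance
    sigma2 / n, which replaces sigma2 in the basic shrinkage factor."""
    Y = np.asarray(Y, float)
    n, m = Y.shape
    y_bar = Y.mean(axis=0)  # the m-length average of the observations
    factor = 1.0 - (m - 2) * (sigma2 / n) / np.dot(y_bar, y_bar)
    return factor * y_bar
</syntaxhighlight>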

See also

  • Admissible decision rule
  • Hodges' estimator
  • Shrinkage estimator

References

1. ^{{Citation | last = Stein | first = C. | author-link = Charles Stein (statistician) | contribution = Inadmissibility of the usual estimator for the mean of a multivariate distribution | title = Proc. Third Berkeley Symp. Math. Statist. Prob. | year = 1956 | volume = 1 | pages = 197–206 | url = http://projecteuclid.org/euclid.bsmsp/1200501656 | mr = 0084922 | zbl = 0073.35602}}
2. ^{{Citation | last = James | first = W. | last2 = Stein | first2 = C. | author2-link = Charles Stein (statistician) | contribution = Estimation with quadratic loss | title = Proc. Fourth Berkeley Symp. Math. Statist. Prob. | year = 1961 | volume = 1 | pages = 361–379 | url = http://projecteuclid.org/euclid.bsmsp/1200512173 |mr = 0133191}}
3. ^{{Citation | first = E. L. | last = Lehmann | first2 = G. | last2 = Casella | year = 1998 | title = Theory of Point Estimation | edition = 2nd | location = New York | publisher = Springer}}
4. ^{{cite journal | last1 = Efron | first1 = B. | last2 = Morris | first2 = C. | year = 1973 | title = Stein's Estimation Rule and Its Competitors—An Empirical Bayes Approach | journal = Journal of the American Statistical Association | volume = 68 | issue = 341 | pages = 117–130 | publisher = American Statistical Association | doi = 10.2307/2284155}}
5. ^{{Citation | last = Stander | first = M. | title = Using Stein's estimator to correct the bound on the entropic uncertainty principle for more than two measurements | year = 2017 | arxiv = 1702.02440| bibcode = 2017arXiv170202440S}}
6. ^{{Citation | first = T. W. | last = Anderson | year = 1984 | title = An Introduction to Multivariate Statistical Analysis | edition = 2nd | location = New York | publisher = John Wiley & Sons}}
7. ^{{Citation | last = Bock | first = M. E. | year = 1975 | title = Minimax estimators of the mean of a multivariate normal distribution | journal = Annals of Statistics | volume = 3 | issue = 1 | pages = 209–218 | doi = 10.1214/aos/1176343009 | mr = 381064 | zbl = 0314.62005}}
8. ^{{Citation | last = Brown | first = L. D. |authorlink = Lawrence D. Brown | year = 1966 | title = On the admissibility of invariant estimators of one or more location parameters | journal = Annals of Mathematical Statistics | volume = 37 | issue = 5 | pages = 1087–1136 | doi = 10.1214/aoms/1177699259 | mr = 216647 | zbl = 0156.39401}}

Further reading

  • {{cite book |last=Judge |first=George G. |last2=Bock |first2=M. E. |title=The Statistical Implications of Pre-Test and Stein-Rule Estimators in Econometrics |location=New York |publisher=North Holland |year=1978 |isbn=0-7204-0729-X |chapter= |pages=229–257 }}
{{DEFAULTSORT:James-Stein Estimator}}

Categories: Estimator | Normal distribution
