Entry | Shrinkage estimator |
Definition |
Examples

One general result is that many standard estimators can be improved, in terms of mean squared error (MSE), by shrinking them towards zero (or towards any other fixed constant). Assume that the expected value of the raw estimate is not zero, and consider estimators obtained by multiplying the raw estimate by a parameter. A value of this parameter can be chosen to minimize the MSE of the new estimate; for that value, the new estimate has a smaller MSE than the raw one, so it has been improved. A side effect is that an unbiased raw estimate may be converted into an improved, but biased, one.

A well-known example arises in estimating the population variance from the sample variance. For a sample of size n, the divisor n − 1 in the usual formula (Bessel's correction) gives an unbiased estimator, while other divisors have lower MSE at the expense of bias. The optimal choice of divisor (the weighting of the shrinkage) depends on the excess kurtosis of the population, as discussed at mean squared error: variance, but one can always do better (in terms of MSE) than the unbiased estimator; for the normal distribution, a divisor of n + 1 gives the minimum mean squared error.

Background

Shrinkage is implicit in Bayesian inference and penalized likelihood inference, and explicit in James–Stein-type inference. In contrast, simple maximum-likelihood and least-squares estimation procedures do not include shrinkage effects, although they can be used within shrinkage estimation schemes.

Applications

Copas

The use of shrinkage estimators in the context of regression analysis, where there may be a large number of explanatory variables, has been described by Copas.[1] Here the estimated regression coefficients are shrunk towards zero, which reduces the mean squared error of the model's predictions when it is applied to new data.
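The claim above that, for normally distributed data, the divisor n + 1 minimizes the MSE of the variance estimator can be checked with a quick Monte Carlo simulation (a sketch, not part of the original article; the sample size and number of trials are arbitrary choices):

```python
import numpy as np

# Estimate a true variance of 1 from samples of size n, using the
# sum of squared deviations divided by n-1 (unbiased), n, and n+1.
rng = np.random.default_rng(0)
n, trials, true_var = 10, 200_000, 1.0

x = rng.normal(0.0, 1.0, size=(trials, n))
ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

# Empirical MSE for each divisor; for normal data, theory predicts
# MSE = (2(n-1) + (n-1-d)^2) / d^2, which is smallest at d = n+1.
mses = {d: float(np.mean((ss / d - true_var) ** 2)) for d in (n - 1, n, n + 1)}
for d, mse in mses.items():
    print(f"divisor {d}: empirical MSE = {mse:.4f}")
```

With these settings the simulated MSEs decrease as the divisor moves from n − 1 to n + 1, matching the theoretical values 2(n−1)/(n−1)² ≈ 0.222, 0.190, and 0.182 for n = 10.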
A later paper by Copas[2] applies shrinkage to the problem of predicting a binary response on the basis of binary explanatory variables.

Hausser and Strimmer

Hausser and Strimmer "develop a James-Stein-type shrinkage estimator, resulting in a procedure that is highly efficient statistically as well as computationally. Despite its simplicity, ...it outperforms eight other entropy estimation procedures across a diverse range of sampling scenarios and data-generating models, even in cases of severe undersampling. ...method is fully analytic and hence computationally inexpensive. Moreover, ...procedure simultaneously provides estimates of the entropy and of the cell frequencies. ...The proposed shrinkage estimators of entropy and mutual information, as well as all other investigated entropy estimators, have been implemented in R (R Development Core Team, 2008). A corresponding R package "entropy" was deposited in the R archive CRAN and is accessible at the URL https://cran.r-project.org/web/packages/entropy/ under the GNU General Public License."[3]
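A minimal sketch of the idea behind this kind of entropy estimator: shrink the maximum-likelihood cell frequencies towards the uniform distribution before plugging them into the entropy formula. The shrinkage-intensity formula below follows the James–Stein form used by Hausser and Strimmer, but this is an illustrative simplification, not a reproduction of the published estimator or of the `entropy` R package:

```python
import numpy as np

def shrinkage_entropy(counts):
    """Plug-in entropy (in nats) after shrinking ML cell frequencies
    toward the uniform target, with a data-driven shrinkage intensity."""
    counts = np.asarray(counts, dtype=float)
    n, K = counts.sum(), len(counts)
    p_ml = counts / n                     # maximum-likelihood frequencies
    target = np.full(K, 1.0 / K)          # shrinkage target: uniform
    # Estimated optimal shrinkage intensity, clipped to [0, 1].
    num = 1.0 - np.sum(p_ml ** 2)
    den = (n - 1.0) * np.sum((target - p_ml) ** 2)
    lam = 1.0 if den == 0.0 else min(1.0, max(0.0, num / den))
    p = lam * target + (1.0 - lam) * p_ml
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

h_uniform = shrinkage_entropy([10, 10, 10, 10])   # balanced counts
h_sparse = shrinkage_entropy([8, 1, 1, 0, 0])     # severe undersampling
print(h_uniform, h_sparse)
```

For balanced counts the estimate equals log K, the entropy of the uniform distribution; for sparse counts the shrinkage pulls empty cells away from zero, raising the estimate above the (downward-biased) ML plug-in value.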
References

1. Copas, J.B. (1983). "Regression, Prediction and Shrinkage". Journal of the Royal Statistical Society, Series B. 45 (3): 311–354. MR 737642. JSTOR 2345402.
2. Copas, J.B. (1993). "The shrinkage of point scoring methods". Journal of the Royal Statistical Society, Series C. 42 (2): 315–331. JSTOR 2986235.
3. Hausser, Jean; Strimmer (2009). "Entropy Inference and the James-Stein Estimator, with Application to Nonlinear Gene Association Networks". Journal of Machine Learning Research. 10: 1469–1484. http://jmlr.csail.mit.edu/papers/volume10/hausser09a/hausser09a.pdf

Statistical software

Hausser, Jean. "entropy" (entropy package for R). https://cran.r-project.org/web/packages/entropy/