Residual sum of squares

Contents

  1. One explanatory variable

  2. Matrix expression for the OLS residual sum of squares

  3. Relation with Pearson's product-moment correlation

  4. See also

  5. References


In statistics, the residual sum of squares (RSS), also known as the sum of squared residuals (SSR) or the sum of squared errors of prediction (SSE), is the sum of the squares of residuals (deviations of the predicted values from the actual empirical values of the data). It is a measure of the discrepancy between the data and an estimation model. A small RSS indicates a tight fit of the model to the data. It is used as an optimality criterion in parameter selection and model selection.

In general, total sum of squares = explained sum of squares + residual sum of squares. For a proof of this in the multivariate ordinary least squares (OLS) case, see partitioning in the general OLS model.
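As a quick numerical illustration (a minimal NumPy sketch with hypothetical data, not drawn from the article itself), the decomposition can be checked for an ordinary least-squares line fit:

    import numpy as np

    # Hypothetical data: x is the explanatory variable, y the variable to be predicted.
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    # Ordinary least-squares fit of a straight line y ≈ a + b*x.
    b, a = np.polyfit(x, y, deg=1)           # slope b, intercept a
    y_hat = a + b * x                        # fitted (predicted) values

    tss = np.sum((y - y.mean()) ** 2)        # total sum of squares
    ess = np.sum((y_hat - y.mean()) ** 2)    # explained sum of squares
    rss = np.sum((y - y_hat) ** 2)           # residual sum of squares

    print(tss, ess + rss)                    # the two totals agree up to rounding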

One explanatory variable

In a model with a single explanatory variable, RSS is given by

RSS = \sum_{i=1}^{n} (y_i - f(x_i))^2,

where y_i is the i-th value of the variable to be predicted, x_i is the i-th value of the explanatory variable, and f(x_i) is the predicted value of y_i (also termed \hat{y}_i).

In a standard linear simple regression model, y_i = a + b x_i + ε_i, where a and b are coefficients, y and x are the regressand and the regressor, respectively, and ε is the error term. The sum of squares of residuals is the sum of squares of the estimates of ε_i; that is,

RSS = \sum_{i=1}^{n} (\hat{\varepsilon}_i)^2 = \sum_{i=1}^{n} \left(y_i - (\hat{a} + \hat{b} x_i)\right)^2,

where \hat{a} is the estimated value of the constant term a and \hat{b} is the estimated value of the slope coefficient b.
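Reusing the hypothetical data from the sketch above, the estimates \hat{a} and \hat{b} and the resulting RSS can be computed directly from the standard closed-form least-squares formulas (a minimal sketch, not taken from this article):

    import numpy as np

    # Hypothetical data for the simple regression y_i = a + b*x_i + ε_i.
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    # Closed-form least-squares estimates of the slope and the constant term.
    b_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    a_hat = y.mean() - b_hat * x.mean()

    # The residuals are the estimates of the error terms ε_i.
    residuals = y - (a_hat + b_hat * x)
    rss = np.sum(residuals ** 2)
    print(a_hat, b_hat, rss)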

Matrix expression for the OLS residual sum of squares

The general regression model with n observations and k explanators, the first of which is a constant unit vector whose coefficient is the regression intercept, is

y = X \beta + e,

where y is an n × 1 vector of dependent variable observations, each column of the n × k matrix X is a vector of observations on one of the k explanators, \beta is a k × 1 vector of true coefficients, and e is an n × 1 vector of the true underlying errors. The ordinary least squares estimator for \beta is

\hat{\beta} = (X^T X)^{-1} X^T y.

The residual vector \hat{e} = y - X\hat{\beta} = y - X (X^T X)^{-1} X^T y, so the residual sum of squares is

RSS = \hat{e}^T \hat{e} = \lVert \hat{e} \rVert^2

(equivalent to the square of the norm of residuals); in full,

RSS = y^T y - y^T X (X^T X)^{-1} X^T y = y^T [I - X (X^T X)^{-1} X^T] y = y^T [I - H] y,

where H is the hat matrix, or the projection matrix in linear regression.
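A minimal NumPy check of these matrix expressions (hypothetical data; the variable names are illustrative only):

    import numpy as np

    rng = np.random.default_rng(0)
    n, k = 20, 3

    # Design matrix whose first column is the constant unit vector.
    X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
    beta = np.array([1.0, 2.0, -0.5])
    y = X @ beta + rng.normal(scale=0.3, size=n)

    # OLS estimator and residual vector.
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)   # (X'X)^{-1} X'y
    e_hat = y - X @ beta_hat

    # RSS three ways: e'e, squared norm of the residuals, and y'(I - H)y.
    H = X @ np.linalg.inv(X.T @ X) @ X.T           # hat (projection) matrix
    rss_1 = e_hat @ e_hat
    rss_2 = np.linalg.norm(e_hat) ** 2
    rss_3 = y @ (np.eye(n) - H) @ y
    print(rss_1, rss_2, rss_3)                     # all three agree up to rounding

Computing \hat{\beta} via np.linalg.solve rather than an explicit inverse is the numerically preferable route; the explicit hat matrix is formed here only to mirror the formula above.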

Relation with Pearson's product-moment correlation

The least-squares regression line is given by

y = a x + b,

where b = \bar{y} - a \bar{x} and a = S_{xy} / S_{xx}, where S_{xy} = \sum (\bar{x} - x_i)(\bar{y} - y_i) and S_{xx} = \sum (\bar{x} - x_i)^2.

Therefore,

RSS = \sum (y_i - f(x_i))^2
    = \sum \left(y_i - (a x_i + b)\right)^2
    = \sum \left(y_i - a x_i - \bar{y} + a \bar{x}\right)^2
    = \sum \left(a(\bar{x} - x_i) - (\bar{y} - y_i)\right)^2
    = a^2 S_{xx} - 2 a S_{xy} + S_{yy}
    = S_{yy} - a S_{xy}
    = S_{yy} \left(1 - \frac{S_{xy}^2}{S_{xx} S_{yy}}\right),

where S_{yy} = \sum (\bar{y} - y_i)^2.

The Pearson product-moment correlation is given by r = S_{xy} / \sqrt{S_{xx} S_{yy}}; therefore, RSS = S_{yy}(1 - r^2).
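This relation is straightforward to verify numerically (a minimal sketch with hypothetical data; np.corrcoef supplies the Pearson correlation r):

    import numpy as np

    # Hypothetical data.
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    S_xx = np.sum((x - x.mean()) ** 2)
    S_yy = np.sum((y - y.mean()) ** 2)
    S_xy = np.sum((x - x.mean()) * (y - y.mean()))

    a = S_xy / S_xx                  # slope of the least-squares line
    b = y.mean() - a * x.mean()      # intercept

    rss = np.sum((y - (a * x + b)) ** 2)
    r = np.corrcoef(x, y)[0, 1]      # Pearson product-moment correlation

    print(rss, S_yy * (1 - r ** 2))  # the two values agree up to rounding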

See also

  • Sum of squares (statistics)
  • Squared deviations
  • Errors and residuals in statistics
  • Lack-of-fit sum of squares
  • Degrees of freedom (statistics)#Sum of squares and degrees of freedom
  • Chi-squared distribution#Applications
  • Mean squared error

References

  • Draper, N. R.; Smith, H. (1998). Applied Regression Analysis (3rd ed.). John Wiley. ISBN 0-471-17082-8.

