MSE-improvement of the least squares estimator by dropping variables

Authors: Erkki P. Liski (1), Götz Trenkler (2)

Affiliation: (1) University of Tampere, P.O. Box 607, SF-33101 Tampere, Finland; (2) University of Dortmund, P.O. Box 50 05 00, 4600 Dortmund 50, Germany

Abstract: It is well known that dropping variables in regression analysis decreases the variance of the least squares (LS) estimator of the remaining parameters. However, after elimination, the estimates of these parameters are biased if the full model is correct. In a recent paper, Boscher (1991) showed that, in the special case of a mean shift model (cf. Cook and Weisberg, 1982), the LS-estimator computed under the assumption of no "outliers" can be viewed as an LS-estimator in a linear regression model from which some variables have been deleted. He derived conditions under which this estimator outperforms the LS-estimator of the full model in terms of the mean squared error (MSE)-matrix criterion. We demonstrate that this approach extends to the general setting of dropping variables. Necessary and sufficient conditions for the MSE-matrix superiority of the LS-estimator in the reduced model over that in the full model are derived. We also provide a uniformly most powerful F-statistic for testing the MSE-improvement.
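As a rough illustration of the tradeoff the abstract describes, the following minimal Python sketch (not from the paper; the sample size, coefficient values, and noise level are assumptions chosen for illustration) simulates a full model y = X1·beta1 + X2·beta2 + eps, estimates beta1 by LS in both the full model and the reduced model with X2 dropped, and checks empirically whether the difference of the estimated MSE matrices of the two estimators is nonnegative definite:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (all values are assumptions, not from the paper):
# full model y = X1 @ beta1 + X2 @ beta2 + eps, eps ~ N(0, sigma^2 I).
n, p1, p2 = 200, 3, 2
sigma = 1.0
X1 = rng.normal(size=(n, p1))
X2 = rng.normal(size=(n, p2))
beta1 = np.array([1.0, -0.5, 0.25])
beta2 = np.array([0.05, -0.05])   # small coefficients: candidates for deletion

reps = 5000
full_est = np.empty((reps, p1))   # LS estimates of beta1 in the full model
red_est = np.empty((reps, p1))    # LS estimates of beta1 after dropping X2
X = np.hstack([X1, X2])
for r in range(reps):
    y = X1 @ beta1 + X2 @ beta2 + sigma * rng.normal(size=n)
    full_est[r] = np.linalg.lstsq(X, y, rcond=None)[0][:p1]
    red_est[r] = np.linalg.lstsq(X1, y, rcond=None)[0]

def mse_matrix(est, truth):
    """Empirical MSE matrix E[(b - beta)(b - beta)'] over the replications."""
    d = est - truth
    return d.T @ d / len(d)

M_full = mse_matrix(full_est, beta1)
M_red = mse_matrix(red_est, beta1)

# MSE-matrix superiority of the reduced-model estimator means that
# M_full - M_red is nonnegative definite; inspect its eigenvalues.
print(np.linalg.eigvalsh(M_full - M_red))
```

With beta2 this small relative to the noise level, the eigenvalues typically come out nonnegative, i.e. the reduced-model estimator improves on the full-model one in the MSE-matrix sense; making beta2 large reverses this, in line with the bias/variance tradeoff the abstract describes. The paper's exact conditions and the F-test for MSE-improvement are not reproduced here.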

Keywords: Dropping variables; mean squared error superiority; least squares estimator; F-test
|