Tests for Differences between Least Squares and Robust Regression Parameter Estimates and Related Topics
Maravina, Tatiana A.
At present there is no well-accepted test for comparing least squares (LS) and robust linear regression coefficient estimates. To fill this gap we propose two Wald-like statistical tests, using the class of MM-estimators for robust regression, and demonstrate their efficacy. The tests are designed to detect significant differences between LS and robust estimates arising both from the inefficiency of LS under fat-tailed non-normality and from the significantly larger biases of LS coefficient estimators, relative to their robust counterparts, under bias-inducing distributions. The asymptotic normality of the test statistics is established, and the finite-sample level and power of the tests are evaluated by Monte Carlo, with the latter yielding promising results. The first part of our research focuses on the LS and robust regression slope estimators, both of which are consistent under skewed error distributions. A second part focuses on intercept estimation, in which case the robust MM-intercept estimator must be adjusted for bias under skewed error distributions. An interesting by-product of our research is that the slowly re-descending Tukey bisquare loss function leads to better test performance than the rapidly re-descending min-max bias optimal loss function.
- Statistics