Seminars

Variable selection in linear regression through adaptive penalty selection

Chuan-Fa Tang

2011-01-07
12:45:00 - 14:45:00

R440, Astronomy and Mathematics Building

Model selection procedures often use a fixed penalty, such as Mallows' Cp, to avoid choosing a model that fits a particular data set extremely well. These procedures are often devised to give an unbiased risk estimate when a particular chosen model is used to predict future responses. As a correction for not accounting for the variability induced by model selection, a concept of generalized degrees of freedom is introduced in Ye (1998), which leads to a data-adaptive complexity penalty in Shen and Ye (2002). In this article, we evaluate whether such an approach leads to a good model in terms of model selection consistency for linear regression when the considered candidate models are nested. In addition, an evaluation of the data perturbation approach in Shen and Ye (2002) is also provided.
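As a rough illustration of the two ideas the abstract contrasts, the sketch below (not from the talk; the function names, the choice of perturbation scale tau, and the number of perturbations are our own assumptions) first selects among nested linear models with the fixed penalty 2p of Mallows' Cp, and then estimates the generalized degrees of freedom of that selection-plus-fitting procedure by Monte Carlo data perturbation in the spirit of Ye (1998); a data-adaptive penalty in the sense of Shen and Ye (2002) would use such an estimate in place of the fixed 2p.

```python
import numpy as np

def fit_nested_by_cp(X, y):
    """Selection-then-estimation: pick the nested model (leading columns of X)
    minimizing Mallows' Cp with the fixed penalty 2p, return its fitted values."""
    n, k = X.shape
    beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2_hat = np.sum((y - X @ beta_full) ** 2) / (n - k)  # full-model variance estimate
    best_cp, best_fit = np.inf, None
    for p in range(1, k + 1):
        Xp = X[:, :p]
        beta_p, *_ = np.linalg.lstsq(Xp, y, rcond=None)
        fitted = Xp @ beta_p
        cp = np.sum((y - fitted) ** 2) / sigma2_hat - n + 2 * p  # fixed penalty 2p
        if cp < best_cp:
            best_cp, best_fit = cp, fitted
    return best_fit

def estimate_gdf(fit_fn, y, tau=0.5, n_perturb=200, seed=None):
    """Monte Carlo estimate of generalized degrees of freedom (Ye, 1998 style):
    perturb y with N(0, tau^2) noise, refit with the full procedure fit_fn, and
    sum the average sensitivity of each fitted value to its own perturbation."""
    rng = np.random.default_rng(seed)
    n = len(y)
    deltas = rng.normal(scale=tau, size=(n_perturb, n))
    fits = np.array([fit_fn(y + d) for d in deltas])
    # Slope of fitted value i on perturbation i, summed over observations.
    return sum(np.cov(fits[:, i], deltas[:, i])[0, 1] / tau**2 for i in range(n))

# Toy usage: three truly active predictors among six candidates.
rng = np.random.default_rng(0)
n, k = 100, 6
X = rng.standard_normal((n, k))
y = X[:, :3] @ np.array([2.0, -1.0, 0.5]) + rng.standard_normal(n)
gdf = estimate_gdf(lambda yy: fit_nested_by_cp(X, yy), y, seed=1)
```

In examples of this kind the estimated generalized degrees of freedom typically exceed the size of the selected model, which is precisely the extra variability induced by the selection step that a fixed penalty ignores and an adaptive penalty tries to account for.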