Bayesian subset selections in multiple regression models
Albert Lo
2009-04-13
09:30:00 - 11:00:00
405, Mathematics Research Center Building (ori. New Math. Bldg.)
The selection of predictors to include is an important problem in building a multiple regression model. The Bayesian approach to statistical inference converts the inference problem into the elementary problem of evaluating conditional (i.e., posterior) distributions and is rather straightforward. This approach often assumes a normal error, which is a severe restriction; in particular, heavy-tailed error densities are excluded. The Bayesian mixture method can be used to relax this restriction and allow for seemingly more realistic errors that are unimodal and/or symmetric. The main thrust of this method is to reduce an infinite-dimensional stochastic analysis problem of averaging random distributions to a finite-dimensional one based on averaging random partitions. The posterior distribution of the parameters is an average over random partitions, and an MCMC scheme that nests a Metropolis-Hastings algorithm within the weighted Chinese restaurant process for sampling partitions yields a stochastic search for the posterior mode of the parameters. Numerical examples are given. (Joint work with Ms Baoqian Pao.)
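As a rough illustration of the stochastic-search idea described in the abstract, the sketch below runs a Metropolis-Hastings search over predictor-inclusion indicators and tracks the best subset visited. It is not the speaker's method: it replaces the Bayesian mixture / weighted Chinese restaurant partition machinery with a simple BIC approximation to the marginal likelihood under a normal-error working model, and the function and variable names (`log_score`, `gamma`, etc.) are hypothetical choices for this toy example only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 5 of 12 candidate predictors are truly active,
# with heavy-tailed (Student-t) errors as in the talk's motivation.
n, p = 200, 12
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = [2.0, -1.5, 1.0, 0.8, -0.6]
y = X @ beta_true + rng.standard_t(df=3, size=n)

def log_score(gamma):
    """Approximate log posterior of a subset via -BIC/2 (normal working model)."""
    k = int(gamma.sum())
    if k == 0:
        rss = np.sum((y - y.mean()) ** 2)
    else:
        Xg = np.column_stack([np.ones(n), X[:, gamma]])
        beta_hat = np.linalg.lstsq(Xg, y, rcond=None)[0]
        resid = y - Xg @ beta_hat
        rss = resid @ resid
    bic = n * np.log(rss / n) + (k + 1) * np.log(n)
    return -0.5 * bic

# Metropolis-Hastings stochastic search over inclusion indicators.
gamma = np.zeros(p, dtype=bool)
current = log_score(gamma)
best_gamma, best = gamma.copy(), current
for it in range(5000):
    j = rng.integers(p)            # propose flipping one predictor in or out
    prop = gamma.copy()
    prop[j] = not prop[j]
    new = log_score(prop)
    if np.log(rng.uniform()) < new - current:   # symmetric proposal: plain MH ratio
        gamma, current = prop, new
        if current > best:                       # keep the posterior-mode candidate
            best_gamma, best = gamma.copy(), current

print("selected predictors:", np.flatnonzero(best_gamma))
```

In the method described in the talk, a step of this kind would sit inside a weighted Chinese restaurant process that also resamples the partition of observations, so that the error density is modeled as a mixture rather than a single normal.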