No. 211 Nonparametric Regression Density Estimation Using Smoothly Varying Normal Mixtures

by Mattias Villani, Robert Kohn and Paolo Giordani

 

September 2007

 

Abstract

We model a regression density nonparametrically so that at each value of the covariates the density is a mixture of normals, with the means, variances and mixture probabilities of the components changing smoothly as a function of the covariates. The model extends existing models in two important ways. First, the components are allowed to be heteroscedastic regressions, since the standard model with homoscedastic regressions can give a poor fit to heteroscedastic data, especially when the number of covariates is large. Moreover, far fewer heteroscedastic components are typically needed, which makes the model easier to interpret and speeds up the computation. The second main extension is to introduce a novel variable selection prior into all components of the model. The variable selection prior acts as a self-adjusting mechanism that prevents overfitting and makes it feasible to fit high-dimensional nonparametric surfaces. We use Bayesian inference and Markov Chain Monte Carlo methods to estimate the model. Simulated and real examples are used to show that the full generality of our model is required to fit a large class of densities.
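To make the model class concrete, the sketch below evaluates the density of a smooth mixture of heteroscedastic normal regressions at a single covariate value: each component has a linear mean and a log-linear variance in the covariates, and the mixture probabilities follow a multinomial logit (softmax) in the covariates. This is a minimal illustration only, not the authors' implementation; the parameter names (beta, delta, gamma) and the plain linear forms are assumptions made for exposition, whereas the paper additionally uses spline expansions of the covariates and variable selection priors on all parts of the model.

```python
import numpy as np
from scipy.stats import norm

def smooth_mixture_density(y, x, beta, delta, gamma):
    """Density p(y | x) of a smooth mixture of heteroscedastic normal regressions.

    Illustrative sketch: for each of the K components,
      mean_k(x)    = x @ beta[k]           (regression mean)
      log var_k(x) = x @ delta[k]          (heteroscedastic, log-linear variance)
      weight_k(x)  = softmax(x @ gamma)_k  (mixture probabilities varying with x)
    beta, delta, gamma are (K, p) arrays; x is a length-p covariate vector.
    """
    eta = x @ gamma.T                      # gating scores, one per component
    w = np.exp(eta - eta.max())
    w /= w.sum()                           # smoothly varying mixture weights
    mu = x @ beta.T                        # component means at x
    sigma = np.exp(0.5 * (x @ delta.T))    # component standard deviations at x
    return np.sum(w * norm.pdf(y, loc=mu, scale=sigma))

# Example with two components and one covariate plus an intercept
x = np.array([1.0, 0.5])                   # [intercept, covariate value]
beta = np.array([[0.0, 1.0], [2.0, -1.0]])
delta = np.array([[-1.0, 0.5], [0.0, 0.0]])
gamma = np.array([[0.0, 0.0], [1.0, -2.0]])
print(smooth_mixture_density(0.3, x, beta, delta, gamma))
```

In the paper these parameters are estimated by Bayesian inference with Markov Chain Monte Carlo rather than fixed as above, and the variable selection prior can switch individual covariates in or out of the mean, variance and gating parts of each component.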

 

Keywords

Bayesian inference, Markov Chain Monte Carlo, Mixture of Experts, Predictive inference, Splines, Value-at-Risk, Variable selection.
