On using mixtures and modes of mixtures in data analysis
My thesis addresses two topics: modal local polynomial regression and label switching for Bayesian mixtures.
Modal Local Polynomial Regression. By combining the ideas of local polynomial regression (LPR) and modal regression, we develop a new adaptive, robust nonparametric regression method, modal local polynomial regression (MLPR). We prove that, asymptotically, MLPR attains a smaller mean squared error (MSE) than LPR when outliers are present or when the error distribution has heavier tails than the normal distribution. Furthermore, unlike many general-purpose robust methods, MLPR achieves robustness without sacrificing efficiency: when there are no outliers, or when the error distribution is light-tailed (e.g., Gaussian), MLPR performs at least as well as LPR.
In short, at the cost of one additional tuning parameter, MLPR outperforms traditional LPR.
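The modal idea can be sketched as follows: instead of a kernel-weighted least-squares fit, one maximizes a kernel-weighted density of the residuals, which an EM-style iteration reduces to iteratively reweighted least squares. This is a minimal illustrative sketch, not the thesis's exact algorithm; the function name `mlpr_fit`, the Gaussian kernels, and the bandwidth choices `h1` (in x) and `h2` (in the residual) are assumptions for illustration.

```python
import numpy as np

def mlpr_fit(x, y, x0, h1, h2, degree=1, n_iter=30):
    """Illustrative modal local polynomial fit at a single point x0.

    Alternates two steps: (E) reweight each observation by a Gaussian
    kernel of its current residual, on top of the usual locality weights
    in x; (M) refit the local polynomial by weighted least squares.
    Large residuals (outliers) receive weights near zero."""
    X = np.vander(x - x0, degree + 1, increasing=True)   # local polynomial basis
    kx = np.exp(-0.5 * ((x - x0) / h1) ** 2)             # locality weights in x
    w = kx.copy()
    beta = np.zeros(degree + 1)
    for _ in range(n_iter):
        # M-step: weighted least squares with current weights w
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        # E-step: downweight points with large residuals (second bandwidth h2)
        r = y - X @ beta
        w = kx * np.exp(-0.5 * (r / h2) ** 2)
    return beta[0]  # fitted (modal) regression value at x0
```

The second bandwidth `h2` is the extra tuning parameter mentioned above: as `h2` grows, the residual weights flatten and the fit approaches ordinary LPR.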
Label Switching for Bayesian Mixtures. Label switching is one of the most fundamental problems in Bayesian mixture model estimation. We propose two main methods to solve it. The first labels the samples using the modes of the posterior distribution; to locate these modes, we develop an algorithm for Bayesian mixtures that combines the ideas of ECM (Meng and Rubin, 1993) and the Gibbs sampler. This labelling method induces a natural, intuitive partition of the parameter space into labelled regions and has an appealing interpretation in terms of the highest posterior density (HPD) region. The second method labels the Gibbs samples by minimizing the normal likelihood of the labelled samples. Unlike the order-constraint method, it extends easily to the high-dimensional case and is invariant to the scale of the component parameters. In addition, this labelling method can also be used to resolve label switching in the frequentist case. (Abstract shortened by UMI.)
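The mechanics of relabelling can be illustrated with a generic permutation-based scheme: for each Gibbs draw, choose the permutation of the K component labels that optimizes some criterion. The sketch below uses squared distance to a reference draw as a stand-in criterion; the function name `relabel_draws` and this particular criterion are illustrative assumptions, not the likelihood-based criterion of the thesis.

```python
import itertools
import numpy as np

def relabel_draws(draws, reference=None):
    """Permutation relabelling for MCMC draws from a K-component mixture.

    draws: array of shape (n_draws, K, d), the component parameter
    vectors from each Gibbs iteration. Each draw is permuted so that it
    is closest (in squared distance) to a reference draw; here the first
    draw serves as the reference. Exhaustive search over the K!
    permutations is feasible for small K."""
    n, K, d = draws.shape
    ref = draws[0] if reference is None else reference
    out = np.empty_like(draws)
    for t in range(n):
        best = min(itertools.permutations(range(K)),
                   key=lambda p: np.sum((draws[t, list(p)] - ref) ** 2))
        out[t] = draws[t, list(best)]
    return out
```

After relabelling, component-wise posterior summaries (means, intervals) are computed on `out` rather than on the raw, label-switched draws.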