Optimal classifiers with minimum expected error within a Bayesian framework — Part II: Properties and performance analysis


Type: Journal Publication

Web: http://www.sciencedirect.com/science/article/pii/S0031320312004530


Abstract: In part I of this two-part study, we introduced a new optimal Bayesian classification methodology that utilizes the same modeling framework proposed in Bayesian minimum-mean-square error (MMSE) error estimation. Optimal Bayesian classification thus completes a Bayesian theory of classification, where both the classifier error and our estimate of the error may be simultaneously optimized and studied probabilistically within the assumed model. Having developed optimal Bayesian classifiers in discrete and Gaussian models in part I, here we explore properties of optimal Bayesian classifiers, in particular, invariance to invertible transformations, convergence to the Bayes classifier, and a connection to Bayesian robust classifiers. We also explicitly derive optimal Bayesian classifiers with non-informative priors, and explore relationships to linear and quadratic discriminant analysis (LDA and QDA), which may be viewed as plug-in rules under Gaussian modeling assumptions. Finally, we present several simulations addressing the robustness of optimal Bayesian classifiers to false modeling assumptions. Companion website: http://gsp.tamu.edu/Publications/supplementary/dalton12a.
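The abstract notes that LDA and QDA may be viewed as plug-in rules under Gaussian modeling assumptions, meaning sample estimates of the means and covariances are substituted directly into the Gaussian discriminant in place of the true parameters. The snippet below is a minimal illustrative sketch of that plug-in view only; it is not the paper's optimal Bayesian classifier, and the function names (fit_plug_in, classify) and the example data are hypothetical choices made here for illustration.

```python
# Illustrative sketch of LDA/QDA as "plug-in" rules under Gaussian assumptions:
# sample means and covariances stand in for the unknown true parameters.
# This is NOT the optimal Bayesian classifier developed in the paper.
import numpy as np

def fit_plug_in(X0, X1, pooled=True):
    """Estimate per-class Gaussian parameters; pooled covariance -> LDA, per-class -> QDA."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    S0 = np.cov(X0, rowvar=False)
    S1 = np.cov(X1, rowvar=False)
    if pooled:  # LDA: a single shared covariance estimate
        n0, n1 = len(X0), len(X1)
        S = ((n0 - 1) * S0 + (n1 - 1) * S1) / (n0 + n1 - 2)
        S0 = S1 = S
    p1 = len(X1) / (len(X0) + len(X1))  # estimated prior probability of class 1
    return (m0, S0), (m1, S1), p1

def log_gaussian(x, m, S):
    """Log of the multivariate Gaussian density at x, up to an additive constant."""
    d = x - m
    _, logdet = np.linalg.slogdet(S)
    return -0.5 * (logdet + d @ np.linalg.solve(S, d))

def classify(x, params):
    """Assign class 1 iff its plug-in log posterior exceeds that of class 0."""
    (m0, S0), (m1, S1), p1 = params
    g1 = np.log(p1) + log_gaussian(x, m1, S1)
    g0 = np.log(1 - p1) + log_gaussian(x, m0, S0)
    return int(g1 > g0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X0 = rng.normal([0, 0], 1.0, size=(25, 2))   # synthetic class-0 sample
    X1 = rng.normal([2, 2], 1.0, size=(25, 2))   # synthetic class-1 sample
    params = fit_plug_in(X0, X1, pooled=True)    # pooled=True -> LDA, False -> QDA
    print(classify(np.array([1.8, 1.9]), params))  # expected output: 1
```

In contrast to this plug-in approach, the optimal Bayesian classifier of the two-part study minimizes the expected error over the posterior of the model parameters rather than committing to a single point estimate.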


Cited as: Dalton, L., and Dougherty, E. R., "Optimal classifiers with minimum expected error within a Bayesian framework — Part II: Properties and performance analysis," Pattern Recognition, vol. 46, no. 5, pp. 1301-1314, 2013.
