
Using Query-Specific Variance Estimates to Combine Bayesian Classifiers

Full Text: FINAL.pdf
Other Attachments: ICML06-Poster.pdf

Many of today's best classification results are obtained by combining the responses of a set of base classifiers to produce an answer for the query. This paper explores a novel 'query-specific' combination rule: after learning a set of simple belief network classifiers, we answer each query by combining their individual responses, weighting each response inversely by its variance. These variances reflect the uncertainty of the network parameters, which in turn depends on the training sample. In essence, this variance quantifies each base classifier's confidence in its response to the query at hand.
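
As an illustration only, here is a minimal Python sketch (hypothetical function and variable names) of the inverse-variance combination step, assuming each base classifier has already produced a class posterior and a query-specific variance estimate; the paper derives these variances from the uncertainty of the belief-net parameters, which is not reproduced here.

import numpy as np

def combine_by_inverse_variance(probs, variances, eps=1e-12):
    """Combine per-query class posteriors from several base classifiers,
    weighting each classifier inversely by its query-specific variance.

    probs:     array of shape (n_classifiers, n_classes), one posterior per row
    variances: array of shape (n_classifiers,), the variance each classifier
               estimates around its response for this query
    """
    probs = np.asarray(probs, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / (variances + eps)     # inverse-variance weights
    weights /= weights.sum()              # normalize weights to sum to 1
    combined = weights @ probs            # weighted average of the posteriors
    return combined / combined.sum()      # renormalize to a distribution

# Example: three base classifiers answering one binary query.
probs = [[0.70, 0.30],    # confident classifier (low variance)
         [0.55, 0.45],    # uncertain classifier (high variance)
         [0.40, 0.60]]
variances = [0.01, 0.20, 0.05]
print(combine_by_inverse_variance(probs, variances))

The low-variance classifier dominates the combined answer, which is the intended behaviour of a query-specific weighting.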

Our experimental results show that these 'mixture-using-variance belief net classifiers' (MUVs) work effectively, especially when the base classifiers are learned from balanced bootstrap samples and their results are combined using James-Stein shrinkage. We also found that our variance-based combination rule performed better than both AdaBoost and bagging, even on the set of base classifiers produced by AdaBoost itself. Finally, this framework is extremely efficient, as both the learning and the evaluation components require only straight-line code.
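
As a rough sketch of the two ingredients named above, the following Python (hypothetical names, not the paper's exact procedure) shows a balanced bootstrap sample, which draws an equal number of examples per class with replacement, and a simple positive-part James-Stein shrinkage of a set of estimates toward their common mean, assuming a shared known variance.

import numpy as np

def balanced_bootstrap(X, y, rng=None):
    """Draw a bootstrap sample containing (roughly) the same number of
    examples from each class, sampling with replacement within each class."""
    rng = np.random.default_rng(rng)
    X, y = np.asarray(X), np.asarray(y)
    classes = np.unique(y)
    per_class = max(1, len(y) // len(classes))   # equal share per class
    idx = np.concatenate([
        rng.choice(np.nonzero(y == c)[0], size=per_class, replace=True)
        for c in classes
    ])
    rng.shuffle(idx)
    return X[idx], y[idx]

def james_stein_shrink(estimates, sigma2):
    """Positive-part James-Stein estimator: shrink each estimate toward the
    grand mean, assuming a common known variance sigma2 (needs k >= 4)."""
    x = np.asarray(estimates, dtype=float)
    k = len(x)
    mean = x.mean()
    s = ((x - mean) ** 2).sum()
    factor = max(0.0, 1.0 - (k - 3) * sigma2 / s) if s > 0 else 0.0
    return mean + factor * (x - mean)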

Citation

C. Lee, R. Greiner, S. Wang. "Using Query-Specific Variance Estimates to Combine Bayesian Classifiers". International Conference on Machine Learning (ICML), Pittsburgh, June 2006.

Keywords: Mixture using Variance, Belief Net, Variance, Ensemble method, machine learning
Category: In Conference

BibTeX

@inproceedings{Lee+al:ICML06,
  author    = {Chi-Hoon Lee and Russ Greiner and Shaojun Wang},
  title     = {Using Query-Specific Variance Estimates to Combine Bayesian Classifiers},
  booktitle = {International Conference on Machine Learning (ICML)},
  address   = {Pittsburgh},
  month     = jun,
  year      = {2006},
}

