Using Query-Specific Variance Estimates to Combine Bayesian Classifiers
- Chi-Hoon Lee, Dept of Computing Science
- Russ Greiner, Dept of Computing Science; PI of AICML
- Shaojun Wang, Dept of Computing Science
Other Attachments: | ICML06-Poster.pdf |
Our experimental results show that these 'mixture-using-variance' belief net classifiers (MUVs) work effectively, especially when the base classifiers are learned from balanced bootstrap samples and their predictions are combined using James-Stein shrinkage. Our variance-based combination rule also outperformed both AdaBoost and bagging, even on the set of base classifiers produced by AdaBoost itself. Finally, the framework is extremely efficient, as both the learning and evaluation components require only straight-line code.
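The core idea of a variance-based combination rule can be sketched as inverse-variance weighting: for each query, base classifiers whose estimates have lower variance receive more weight. The snippet below is a minimal illustration of that general principle only; the function name, array shapes, and weighting scheme are assumptions for this sketch and are not the paper's actual MUV algorithm (which also involves James-Stein shrinkage of the estimates).

```python
import numpy as np

def combine_by_variance(means, variances, eps=1e-12):
    """Inverse-variance weighted combination of base-classifier
    probability estimates for one query.

    means, variances: shape (n_classifiers, n_classes).
    Illustrative sketch only, not the paper's MUV rule.
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    # Lower query-specific variance => higher weight for that classifier.
    weights = 1.0 / (variances + eps)
    weights /= weights.sum(axis=0, keepdims=True)
    combined = (weights * means).sum(axis=0)
    return combined / combined.sum()  # renormalize to a distribution
```

Note that the combination is straight-line code, consistent with the efficiency claim above: no iteration or optimization is needed at query time.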
Citation
C. Lee, R. Greiner, S. Wang. "Using Query-Specific Variance Estimates to Combine Bayesian Classifiers". International Conference on Machine Learning (ICML), Pittsburgh, June 2006.
Keywords: | Mixture using Variance, Belief Net, Variance, Ensemble method, Machine Learning |
Category: | In Conference |
Web Links: | Webpage |
BibTeX
@incollection{Lee+al:ICML06,
  author    = {Chi-Hoon Lee and Russ Greiner and Shaojun Wang},
  title     = {Using Query-Specific Variance Estimates to Combine Bayesian Classifiers},
  booktitle = {International Conference on Machine Learning (ICML)},
  year      = 2006,
}
Last Updated: April 01, 2007
Submitted by Russ Greiner