
Boosting in the limit: Maximizing the margin of learned ensembles

Full Text: grove98boosting.pdf

The "minimum margin" of an ensemble classifier on a given training set is, roughly speaking, the smallest vote it gives to any correct training label. Recent work has shown that the AdaBoost algorithm is particularly effective at producing ensembles with large minimum margins, and theory suggests that this may account for its success at reducing generalization error. We note, however, that the problem of finding good margins is closely related to linear programming, and we use this
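The minimum-margin notion from the abstract can be sketched in a few lines. This is an illustrative reading of the standard definition, not code from the paper; the names `weak_learners`, `alphas`, and `min_margin` are made up for the example.

```python
# Sketch: the minimum margin of a weighted voting ensemble on a binary
# training set. Labels and weak-learner outputs are in {-1, +1}; the margin
# of an example (x, y) is y times the normalized weighted vote, and the
# minimum margin is the smallest such value over the training set.
# Helper names here are hypothetical, chosen for illustration.

def min_margin(weak_learners, alphas, X, y):
    total = sum(alphas)  # normalize so margins lie in [-1, +1]
    margins = []
    for x, label in zip(X, y):
        vote = sum(a * h(x) for a, h in zip(alphas, weak_learners))
        margins.append(label * vote / total)
    return min(margins)

# Toy example with two decision stumps on 1-D inputs.
h1 = lambda x: 1 if x > 0 else -1
h2 = lambda x: 1 if x > 2 else -1
X = [-1.0, 1.0, 3.0]
y = [-1, 1, 1]
print(min_margin([h1, h2], [0.7, 0.3], X, y))
```

A boosting method that "maximizes the margin" seeks vote weights `alphas` making this minimum as large as possible, which is what links the problem to linear programming.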

Citation

A. Grove, D. Schuurmans. "Boosting in the limit: Maximizing the margin of learned ensembles". National Conference on Artificial Intelligence (AAAI), June 1998.

Keywords:  
Category: In Conference

BibTeX

@inproceedings{Grove+Schuurmans:AAAI98,
  author = {Adam Grove and Dale Schuurmans},
  title = {Boosting in the limit: Maximizing the margin of learned ensembles},
  booktitle = {National Conference on Artificial Intelligence (AAAI)},
  year = 1998,
}

Last Updated: June 01, 2007
Submitted by Stuart H. Johnson
