
Robust Support Vector Machine Training Via Convex Outlier Ablation

Full Text: AAAI0613XuL.pdf

One of the well-known risks of large margin training methods, such as boosting and support vector machines (SVMs), is their sensitivity to outliers. These risks are normally mitigated by using a soft margin criterion, such as hinge loss, to reduce outlier sensitivity. In this paper, we present a more direct approach that explicitly incorporates outlier suppression in the training process. In particular, we show how outlier detection can be encoded in the large margin training principle of support vector machines. By expressing a convex relaxation of the joint training problem as a semidefinite program, one can use this approach to robustly train a support vector machine while suppressing outliers. We demonstrate that our approach can yield superior results to the standard soft margin approach in the presence of outliers.
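For readers who want a concrete, simplified picture of what outlier ablation aims to do, the sketch below is a loose, hypothetical illustration rather than the convex semidefinite relaxation developed in the paper: it fits a standard soft-margin SVM as a convex program (assuming cvxpy is available), zeroes the weights of the k examples with the largest hinge loss, and refits. The function names (soft_margin_svm, ablate_and_refit) and the parameters C and k are illustrative assumptions; in the paper itself the 0/1 outlier indicators are handled inside a single semidefinite program rather than by this two-stage heuristic.

import numpy as np
import cvxpy as cp

def soft_margin_svm(X, y, C=1.0, weights=None):
    # Linear soft-margin SVM as a convex program; `weights` scales each
    # example's hinge loss, so a weight of 0 removes (ablates) that example.
    n, d = X.shape
    if weights is None:
        weights = np.ones(n)
    w = cp.Variable(d)
    b = cp.Variable()
    hinge = cp.pos(1 - cp.multiply(y, X @ w + b))
    objective = 0.5 * cp.sum_squares(w) + C * cp.sum(cp.multiply(weights, hinge))
    cp.Problem(cp.Minimize(objective)).solve()
    return w.value, b.value

def ablate_and_refit(X, y, C=1.0, k=1):
    # Two-stage heuristic: fit, drop the k largest-hinge-loss points, refit.
    w, b = soft_margin_svm(X, y, C)
    losses = np.maximum(0.0, 1 - y * (X @ w + b))
    weights = np.ones(len(y))
    weights[np.argsort(losses)[-k:]] = 0.0  # suspected outliers
    return soft_margin_svm(X, y, C, weights)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2.0, 1.0, (20, 2)), rng.normal(2.0, 1.0, (20, 2))])
    y = np.hstack([-np.ones(20), np.ones(20)])
    X[0] = [3.0, 3.0]  # plant a mislabeled point deep in the positive class
    w, b = ablate_and_refit(X, y, C=1.0, k=1)
    print("w =", w, "b =", b)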

Citation

L. Xu, K. Crammer, D. Schuurmans. "Robust Support Vector Machine Training Via Convex Outlier Ablation". National Conference on Artificial Intelligence (AAAI), Boston, Massachusetts, USA, July 2006.

Keywords: support vector machines, convex optimization, outlier ablation, machine learning
Category: In Conference

BibTeX

@inproceedings{Xu+al:AAAI06,
  author = {Linli Xu and Koby Crammer and Dale Schuurmans},
  title = {Robust Support Vector Machine Training Via Convex Outlier Ablation},
  booktitle = {National Conference on Artificial Intelligence (AAAI)},
  year = 2006,
}

Last Updated: June 01, 2007
Submitted by Stuart H. Johnson
