
Support Vector Machines on General Confidence Functions

We present a generalized view of support vector machines that does not rely on a Euclidean geometric interpretation, nor even on positive semidefinite kernels. We base our development instead on the confidence matrix: the matrix normally given by the entrywise (Hadamard) product of the kernel matrix with the label outer-product matrix. It turns out that alternative forms of confidence matrices are possible, and indeed useful. By focusing on the confidence matrix instead of the underlying kernel, we can derive an intuitive principle for optimizing example weights to yield robust classifiers. Our principle initially recovers the standard quadratic SVM training criterion, which is convex only for kernel-derived confidence measures. Given our generalized view, however, we are then able to derive a principled relaxation of the SVM criterion that yields a convex upper bound. This relaxation is always convex and can be solved with a linear program. Our new training procedure obtains generalization performance similar to that of standard SVMs on kernel-derived confidence functions, but achieves even better results with indefinite confidence functions.
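To make the central object concrete, the following is a minimal sketch (not the authors' code) of the construction the abstract describes: it builds the confidence matrix C = K ∘ (y yᵀ) from an RBF kernel matrix and a label vector, and evaluates the standard quadratic SVM dual objective in terms of C. The kernel choice, function names, and toy data are illustrative assumptions; the paper's linear-programming relaxation is not reproduced here.

import numpy as np

# Illustrative sketch only: confidence matrix C[i, j] = y_i * y_j * K(x_i, x_j),
# the Hadamard product of the kernel matrix with the label outer product.

def rbf_kernel(X, gamma=1.0):
    # Gaussian RBF kernel: K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def confidence_matrix(K, y):
    # Hadamard (entrywise) product of K with the label outer-product matrix.
    return K * np.outer(y, y)

def svm_dual_objective(alpha, C):
    # Standard quadratic SVM dual objective: sum(alpha) - 0.5 * alpha' C alpha.
    # Maximizing it is a convex problem only when C is positive semidefinite,
    # which holds for kernel-derived confidence matrices.
    return alpha.sum() - 0.5 * alpha @ C @ alpha

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = np.where(X[:, 0] > 0, 1.0, -1.0)  # toy labels in {-1, +1}
K = rbf_kernel(X, gamma=0.5)
C = confidence_matrix(K, y)
alpha = np.full(len(y), 0.1)          # arbitrary example weights (constraints ignored here)
print(svm_dual_objective(alpha, C))

With an indefinite confidence matrix, this same objective becomes non-convex; that is the case the paper's convex upper bound is designed to handle.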

Citation

Y. Guo, D. Schuurmans. "Support Vector Machines on General Confidence Functions". January 2005.

Keywords: support vector machines, machine learning

BibTeX

@incollection{Guo+Schuurmans:05,
  author = {Yuhong Guo and Dale Schuurmans},
  title = {Support Vector Machines on General Confidence Functions},
  year = 2005,
}

