
Maximum Margin Clustering

We propose a new method for clustering based on finding maximum margin hyperplanes through data. By reformulating the problem in terms of the implied equivalence relation matrix, we can pose the problem as a convex integer program. Although this still yields a difficult computational problem, the hard-clustering constraints can be relaxed to a soft-clustering formulation which can be feasibly solved with a semidefinite program. Since our clustering technique only depends on the data through the kernel matrix, we can easily achieve nonlinear clusterings in the same manner as spectral clustering. Experimental results show that our maximum margin clustering technique often obtains more accurate results than conventional clustering methods. The real benefit of our approach, however, is that it leads naturally to a semi-supervised training method for support vector machines. By maximizing the margin simultaneously on labeled and unlabeled training data, we achieve state-of-the-art performance by using a single, integrated learning principle.
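
To make the recipe in the abstract concrete, the sketch below is a minimal, simplified illustration of kernel-based two-way clustering over a relaxed equivalence-relation matrix M ≈ yy' solved as a semidefinite program. It is not the paper's maximum-margin formulation: the objective here simply maximizes within-cluster kernel similarity under diag(M) = 1, M ⪰ 0, and a class-balance constraint, and the function name, the balance parameter, and the use of cvxpy are illustrative assumptions rather than anything specified in the paper.

# Simplified sketch: SDP relaxation over the equivalence-relation matrix,
# followed by eigenvector rounding. NOT the published max-margin objective.
import numpy as np
import cvxpy as cp

def sdp_cluster(K, balance=2.0):
    """Two-way clustering from a kernel matrix K via an SDP relaxation.

    K       : (n, n) symmetric positive semidefinite kernel matrix
    balance : bound on |(M e)_i| enforcing roughly balanced clusters
    returns : vector of +/-1 cluster labels
    """
    n = K.shape[0]
    e = np.ones(n)
    M = cp.Variable((n, n), symmetric=True)
    constraints = [
        M >> 0,              # relaxation of M = y y^T (positive semidefinite)
        cp.diag(M) == 1,     # y_i^2 = 1
        M @ e <= balance,    # soft class-balance constraint
        M @ e >= -balance,
    ]
    # Illustrative objective: maximize within-cluster kernel similarity <K, M>.
    # The paper instead optimizes the SVM margin implied by the labeling.
    objective = cp.Maximize(cp.sum(cp.multiply(K, M)))
    cp.Problem(objective, constraints).solve()
    # Round the relaxed solution with the leading eigenvector of M.
    _, V = np.linalg.eigh(M.value)
    return np.sign(V[:, -1])

# Toy usage: two Gaussian blobs with an RBF kernel.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (10, 2)), rng.normal(2, 0.5, (10, 2))])
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-D2 / 2.0)
print(sdp_cluster(K))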

Citation

L. Xu, J. Neufeld, B. Larson, D. Schuurmans. "Maximum Margin Clustering". Neural Information Processing Systems (NIPS), Vancouver, British Columbia, Canada, December 2004.

Keywords: clustering, machine learning
Category: In Conference

BibTeX

@incollection{Xu+al:NIPS04,
  author = {Linli Xu and James Neufeld and Bryce Larson and Dale Schuurmans},
  title = {Maximum Margin Clustering},
  booktitle = {Neural Information Processing Systems (NIPS)},
  year = 2004,
}

