
Implicit Online Learning with Kernels

Full Text: CheVisSchWanetal06.pdf

We present two new algorithms for online learning in reproducing kernel Hilbert spaces. Our first algorithm, ILK (implicit online learning with kernels), employs a new, implicit update technique that can be applied to a wide variety of convex loss functions. We then introduce a bounded memory version, SILK (sparse ILK), that maintains a compact representation of the predictor without compromising solution quality, even in non-stationary environments. We prove loss bounds and analyze the convergence rate of both. Experimental evidence shows that our proposed algorithms outperform current methods on synthetic and real data.
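To make the implicit update idea concrete, the following is a minimal sketch of an implicit (proximal) online update in an RKHS, assuming squared loss and a Gaussian RBF kernel. The class name ImplicitKernelRegressor, the step size eta, and the oldest-first budget truncation are illustrative assumptions rather than details taken from the paper; the truncation in particular is only a naive stand-in for SILK's sparsification, not the paper's method. Each round solves f_{t+1} = argmin_f [ (f(x_t) - y_t)^2 / 2 + ||f - f_t||_H^2 / (2 eta) ], which for squared loss has the closed-form coefficient used below.

    import numpy as np

    def rbf_kernel(x, z, gamma=1.0):
        """Gaussian RBF kernel between two feature vectors."""
        return np.exp(-gamma * np.sum((x - z) ** 2))

    class ImplicitKernelRegressor:
        """Online kernel regression with an implicit (proximal) update for squared loss.

        Each round solves
            f_{t+1} = argmin_f  (f(x_t) - y_t)^2 / 2 + ||f - f_t||_H^2 / (2*eta),
        which adds one term alpha * k(x_t, .) to the kernel expansion.
        The optional `budget` drops the oldest support vector to keep memory
        bounded (an illustrative simplification, not the SILK scheme).
        """

        def __init__(self, eta=0.5, gamma=1.0, budget=None):
            self.eta = eta
            self.gamma = gamma
            self.budget = budget
            self.support = []   # stored examples x_i
            self.alpha = []     # their expansion coefficients

        def predict(self, x):
            return sum(a * rbf_kernel(xi, x, self.gamma)
                       for a, xi in zip(self.alpha, self.support))

        def partial_fit(self, x, y):
            kxx = rbf_kernel(x, x, self.gamma)
            residual = y - self.predict(x)
            # Closed-form coefficient of the implicit (proximal) step for squared loss.
            a = self.eta * residual / (1.0 + self.eta * kxx)
            self.support.append(x)
            self.alpha.append(a)
            if self.budget is not None and len(self.support) > self.budget:
                self.support.pop(0)
                self.alpha.pop(0)

For example, streaming over (x, y) pairs (here `stream` is any iterable of NumPy vector / float pairs):

    model = ImplicitKernelRegressor(eta=0.5, gamma=0.5, budget=200)
    for x, y in stream:
        y_hat = model.predict(x)   # predict before seeing the label
        model.partial_fit(x, y)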

Citation

L. Cheng, S. V. N. Vishwanathan, D. Schuurmans, S. Wang, T. Caelli. "Implicit Online Learning with Kernels". Neural Information Processing Systems (NIPS), December 2006.

Keywords: machine learning
Category: In Conference

BibTeX

@inproceedings{Cheng+al:NIPS06,
  author = {Li Cheng and S.V.N. Vishwanathan and Dale Schuurmans and Shaojun
    Wang and Terry Caelli},
  title = {Implicit Online Learning with Kernels},
  booktitle = {Neural Information Processing Systems (NIPS)},
  year = 2006,
}

Last Updated: April 24, 2007
Submitted by Nelson Loyola
