
Learning Continuous Latent Variable Models With Bregman Divergences

Full Text: alt03.ps (PS)

We present a class of unsupervised statistical learning algorithms that are formulated in terms of minimizing Bregman divergences, a family of generalized entropy measures associated with convex functions. We obtain novel training algorithms that extract hidden latent structure by minimizing a Bregman divergence on training data, subject to a set of non-linear constraints involving the hidden variables. An alternating minimization procedure with nested iterative scaling is proposed to find feasible solutions for the resulting constrained optimization problem. The convergence of this algorithm along with its information geometric properties are characterized.
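To make the central object concrete, the following is a minimal sketch of a generic Bregman divergence, D_F(p, q) = F(p) - F(q) - ⟨∇F(q), p - q⟩, for a convex function F. The two instantiations shown (squared norm and negative entropy) are standard examples, not taken from the paper itself; the function names are illustrative.

```python
import numpy as np

def bregman_divergence(F, grad_F, p, q):
    """Generic Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>.

    F must be a convex function; grad_F its gradient. The divergence is
    non-negative and zero iff p == q (for strictly convex F), but it is
    generally asymmetric and does not satisfy the triangle inequality.
    """
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

# Example 1: F(x) = ||x||^2 recovers the squared Euclidean distance.
sq = lambda x: np.dot(x, x)
grad_sq = lambda x: 2.0 * x

# Example 2: F(x) = sum_i x_i log x_i (negative entropy) recovers the
# generalized KL divergence; for distributions it is the usual KL divergence.
negent = lambda x: np.sum(x * np.log(x))
grad_negent = lambda x: np.log(x) + 1.0

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.25, 0.25, 0.5])

d_euclid = bregman_divergence(sq, grad_sq, p, q)        # equals ||p - q||^2
d_kl = bregman_divergence(negent, grad_negent, p, q)    # equals sum p*log(p/q) here
```

Because p and q above both sum to one, the linear correction terms in the negative-entropy case cancel and the result coincides exactly with the KL divergence.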

Citation

S. Wang, D. Schuurmans. "Learning Continuous Latent Variable Models With Bregman Divergences". ICASSP, October 2003.

Keywords: Bregman, divergence, machine learning
Category: In Conference

BibTeX

@inproceedings{Wang+Schuurmans:ICASSP03,
  author = {Shaojun Wang and Dale Schuurmans},
  title = {Learning Continuous Latent Variable Models With Bregman Divergences},
  booktitle = {ICASSP},
  year = 2003,
}

Last Updated: June 01, 2007
Submitted by Stuart H. Johnson
