
Convex Deep Learning via Normalized Kernels

Full Text: nips14.pdf

Deep learning has been a long-standing pursuit in machine learning, one that until recently was hampered by unreliable training methods before the discovery of improved heuristics for embedded layer training. A complementary research strategy is to develop alternative modeling architectures that admit efficient training methods while expanding the range of representable structures toward deep models. In this paper, we develop a new architecture for nested nonlinearities that allows arbitrarily deep compositions to be trained to global optimality. The approach admits both parametric and nonparametric forms through the use of normalized kernels to represent each latent layer. The outcome is a fully convex formulation that is able to capture compositions of trainable nonlinear layers to arbitrary depth.
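For background, one standard notion of kernel normalization (the paper's precise construction may differ in detail) rescales a kernel K so that every input has unit self-similarity, i.e. unit norm in the induced feature space:

    \tilde{K}(x, y) = \frac{K(x, y)}{\sqrt{K(x, x)\, K(y, y)}}

so that \tilde{K}(x, x) = 1 for all x. Geometrically, this projects feature vectors onto the unit sphere, which keeps the scale of each latent layer's representation controlled when nonlinear layers are composed.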

Citation

O. Aslan, X. Zhang, D. Schuurmans. "Convex Deep Learning via Normalized Kernels". Neural Information Processing Systems (NIPS), (eds: Zoubin Ghahramani, Max Welling, Corinna Cortes, Neil D. Lawrence, Kilian Q. Weinberger), pp. 3275-3283, December 2014.

Keywords:  
Category: In Conference
Web Links: NeurIPS

BibTeX

@incollection{Aslan+al:NIPS14,
  author = {Ozlem Aslan and Xinhua Zhang and Dale Schuurmans},
  title = {Convex Deep Learning via Normalized Kernels},
  editor = {Zoubin Ghahramani and Max Welling and Corinna Cortes and
    Neil D. Lawrence and Kilian Q. Weinberger},
  pages = {3275--3283},
  booktitle = {Neural Information Processing Systems (NIPS)},
  year = 2014,
}

Last Updated: February 14, 2020
Submitted by Sabina P
