Stochastic Neural Networks with Monotonic Activation Functions

Full Text: ravanbakhsh16.pdf

We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBMs) that are closely linked to Bregman divergences. This family, which we call exponential family RBMs (Exp-RBMs), is a subset of the exponential family Harmoniums that expresses family members through a choice of smooth monotonic non-linearity for each neuron. Using contrastive divergence together with our Gaussian approximation, we show that Exp-RBMs can learn useful representations with novel stochastic units.
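
The sketch below illustrates one reading of the abstract's Gaussian approximation: the stochastic unit's value is drawn from a Gaussian whose mean is f(η) and whose variance is approximated by the derivative f'(η), where f is the chosen smooth monotonic activation and η is the unit's pre-activation input. The function names, the finite-difference derivative, and the sigmoid example are illustrative assumptions, not the authors' code.

import numpy as np

def sample_stochastic_unit(eta, f, eps=1e-6, rng=None):
    """Hypothetical sketch: sample a stochastic unit via the Gaussian
    (Laplace-style) approximation N(f(eta), f'(eta)) for a smooth
    monotonic activation f. f'(eta) is estimated here with a central
    finite difference; an analytic derivative would be used in practice."""
    rng = np.random.default_rng() if rng is None else rng
    mean = f(eta)
    # Central finite-difference estimate of f'(eta).
    deriv = (f(eta + eps) - f(eta - eps)) / (2.0 * eps)
    var = np.maximum(deriv, 0.0)  # monotonic f implies f' >= 0
    return mean + np.sqrt(var) * rng.standard_normal(np.shape(eta))

# Example: a stochastic sigmoid unit.
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
samples = sample_stochastic_unit(np.array([0.5, -1.0, 2.0]), sigmoid)

Under this reading, the same sampling routine works for any smooth monotonic non-linearity (sigmoid, softplus, tanh, ...), which is what lets each neuron in an Exp-RBM pick its own activation.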

Citation

S. Ravanbakhsh, B. Poczos, J. Schneider, D. Schuurmans, R. Greiner. "Stochastic Neural Networks with Monotonic Activation Functions". Artificial Intelligence and Statistics, (ed: Arthur Gretton, Christian C. Robert), pp 809-818, May 2016.

Keywords: graphical models, neural networks, activation functions
Category: In Conference
Web Links: PMLR

BibTeX

@incollection{Ravanbakhsh+al:AISTATS16,
  author = {Siamak Ravanbakhsh and Barnabas Poczos and Jeff Schneider and Dale
    Schuurmans and Russ Greiner},
  title = {Stochastic Neural Networks with Monotonic Activation Functions},
  editor = {Arthur Gretton and Christian C. Robert},
  pages = {809--818},
  booktitle = {Artificial Intelligence and Statistics},
  year = 2016,
}

Last Updated: February 11, 2020
Submitted by Sabina P
