Restricted Boltzmann machines for pre-training deep Gaussian networks

Mark Eastwood, C. Jayne

Research output: Contribution to conference › Paper

8 Citations (Scopus)


A Restricted Boltzmann Machine (RBM) is proposed with an energy function which we show results in hidden node activation probabilities that match the activation rule of neurons in a Gaussian synapse neural network. This makes the proposed RBM a potential tool for pre-training a Gaussian synapse network with a deep architecture, in a similar way to how RBMs have been used in a greedy layer-wise pre-training procedure for deep neural networks with scalar synapses. Using experimental examples, we investigate the training characteristics of this form of RBM and discuss its suitability for pre-training a deep Gaussian synapse network. While this is the most direct route to a deep Gaussian synapse network, we explain and discuss a number of issues found in using the proposed form of RBM in this way, and suggest possible solutions.
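The greedy layer-wise procedure the abstract refers to can be sketched for the standard scalar-synapse case: train a Bernoulli RBM on the data with contrastive divergence, then train the next RBM on the hidden activations of the one below, and stack. The following is a minimal illustrative sketch of that generic procedure, not the paper's method; the paper replaces the sigmoid hidden-activation rule below with a Gaussian synapse rule whose details are not given in this abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli RBM trained with one-step contrastive divergence (CD-1)."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)  # visible biases
        self.c = np.zeros(n_hidden)   # hidden biases
        self.lr = lr

    def hidden_probs(self, v):
        # P(h = 1 | v) for a scalar-synapse RBM. The paper's contribution is an
        # energy function making this match a Gaussian synapse activation rule
        # instead (hypothetical here; not reproduced from the abstract).
        return sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b)

    def cd1_step(self, v0):
        # Positive phase: sample hidden units from the data.
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one reconstruction step (mean-field visibles).
        v1 = self.visible_probs(h0)
        ph1 = self.hidden_probs(v1)
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - v1.T @ ph1) / n
        self.b += self.lr * (v0 - v1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)

def pretrain_stack(data, layer_sizes, epochs=5):
    """Greedy layer-wise pre-training: each RBM is trained on the hidden
    activations of the RBM below it, then the trained RBMs are stacked."""
    stack, x = [], data
    for n_hidden in layer_sizes:
        rbm = RBM(x.shape[1], n_hidden)
        for _ in range(epochs):
            rbm.cd1_step(x)
        stack.append(rbm)
        x = rbm.hidden_probs(x)  # feed activations upward to the next layer
    return stack
```

After pre-training, the stacked weights would initialize a deep network for supervised fine-tuning; the issues the paper raises concern adapting exactly this pipeline to Gaussian synapses.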
Original language: English
Publication status: Published - 2013
Event: The 2013 International Joint Conference on Neural Networks - Dallas, United States
Duration: 4 Aug 2013 – 9 Aug 2013


Conference: The 2013 International Joint Conference on Neural Networks
Abbreviated title: IJCNN
Country/Territory: United States

Bibliographical note

The full text is currently unavailable on the repository.


Keywords

  • Boltzmann machines
  • Gaussian processes
  • probability
  • Gaussian synapse neural network
  • RBM
  • activation rule
  • energy function
  • hidden node activation probabilities
  • neurons
  • pretraining deep Gaussian networks
  • restricted Boltzmann machines
  • scalar synapses
  • Artificial neural networks
  • Biological neural networks
  • Equations
  • Image reconstruction
  • Mathematical model
  • Neurons
  • Training

