Abstract
A Restricted Boltzmann Machine (RBM) is proposed with an energy function which we show yields hidden node activation probabilities that match the activation rule of neurons in a Gaussian synapse neural network. This makes the proposed RBM a potential tool for pre-training a Gaussian synapse network with a deep architecture, analogous to the way RBMs have been used in greedy layer-wise pre-training of deep neural networks with scalar synapses. Using experimental examples, we investigate the training characteristics of this form of RBM and discuss its suitability for pre-training a deep Gaussian synapse network. While this is the most direct route to a deep Gaussian synapse network, we explain and discuss a number of issues found in using the proposed form of RBM in this way, and suggest possible solutions.
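For orientation, the contrast the abstract draws can be sketched in a few lines of NumPy. The standard RBM hidden activation p(h_j = 1 | v) = sigmoid(b_j + Σ_i v_i W_ij) is the textbook rule; the `gaussian_synapse_activation` function is only an illustrative guess at a Gaussian synapse unit (each synapse applies a Gaussian of its input before summation), since the paper's actual energy function is not reproduced in this record. All variable names (`W`, `b`, `mu`, `sigma`) are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rbm_hidden_probs(v, W, b):
    """Standard RBM hidden activation: p(h_j = 1 | v) = sigmoid(b_j + sum_i v_i W_ij)."""
    return sigmoid(b + v @ W)

def gaussian_synapse_activation(v, mu, sigma, b):
    """Hypothetical Gaussian-synapse unit (illustrative only, not the paper's
    derived rule): each synapse i->j passes its input through a Gaussian
    with mean mu_ij and width sigma_ij, and the unit sums the synaptic
    outputs before a sigmoid nonlinearity."""
    g = np.exp(-((v[:, None] - mu) ** 2) / (2.0 * sigma ** 2))  # per-synapse Gaussians
    return sigmoid(b + g.sum(axis=0))

rng = np.random.default_rng(0)
v = rng.random(4)            # 4 visible units
W = rng.normal(size=(4, 3))  # weights to 3 hidden units
b = np.zeros(3)              # hidden biases
mu = rng.normal(size=(4, 3))
sigma = np.ones((4, 3))

p_std = rbm_hidden_probs(v, W, b)
p_gauss = gaussian_synapse_activation(v, mu, sigma, b)
```

The paper's contribution, per the abstract, is choosing an RBM energy function so that the hidden conditionals take the Gaussian-synapse form rather than the plain sigmoid-of-linear-sum form.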
Original language | English |
---|---|
Pages | 1-8 |
DOIs | |
Publication status | Published - 2013 |
Event | The 2013 International Joint Conference on Neural Networks - Dallas, United States (4 Aug 2013 → 9 Aug 2013) |
Conference
Conference | The 2013 International Joint Conference on Neural Networks |
---|---|
Abbreviated title | IJCNN |
Country/Territory | United States |
City | Dallas |
Period | 4/08/13 → 9/08/13 |
Bibliographical note
The full text is currently unavailable on the repository.

Keywords
- Boltzmann machines
- Gaussian processes
- probability
- Gaussian synapse neural network
- RBM
- activation rule
- energy function
- hidden node activation probabilities
- neurons
- pretraining deep Gaussian networks
- restricted Boltzmann machines
- scalar synapses
- Artificial neural networks
- Biological neural networks
- Equations
- Image reconstruction
- Mathematical model
- Neurons
- Training