Generally speaking, a Boltzmann machine is a type of Hopfield network in which the activation of each neuron at each step is determined partly at random rather than fully deterministically. In its original form, where every neuron is connected to every other neuron, a Boltzmann machine is of no practical use, for much the same reasons as Hopfield networks in general.
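To make the stochastic update rule concrete, here is a minimal sketch in Python (the function name and parameters are illustrative, not from the source): each neuron switches on with a probability given by a sigmoid of its net input, so at a high temperature the updates are nearly random, while at a low temperature they approach the deterministic Hopfield rule.

```python
import numpy as np

rng = np.random.default_rng(0)

def update_unit(state, W, b, i, T=1.0):
    """Stochastically update unit i of a Boltzmann machine.

    W is the symmetric weight matrix (zero diagonal, so no
    self-connections) and b the bias vector. The unit turns on with
    probability sigmoid(energy_gap / T).
    """
    energy_gap = W[i] @ state + b[i]      # net input from all other units
    p_on = 1.0 / (1.0 + np.exp(-energy_gap / T))
    state[i] = 1.0 if rng.random() < p_on else 0.0
    return state
```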
A restricted Boltzmann machine, on the other hand, consists of an input layer and a single hidden layer, with randomly initialized connection weights. Every node in the input layer is connected to every node in the hidden layer, but there are no connections within either layer. The aim is similar to that of an autoencoder: the neurons in the hidden layer should learn the salient features of a training set. If the input is pictorial, the learned features can be visualised by stimulating each hidden neuron in turn and reconstructing the corresponding pattern at the input layer.
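As a rough illustration of that visualisation step, assuming a trained binary RBM with weight matrix `W` of shape (visible, hidden) and visible biases `b_visible` (all names are hypothetical):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hidden_unit_features(W, b_visible):
    """Recover the input pattern each hidden neuron responds to.

    Activating one hidden unit at a time and projecting back through
    the shared weights yields an input-space picture of the feature
    that unit has learned.
    """
    n_hidden = W.shape[1]
    features = []
    for j in range(n_hidden):
        h = np.zeros(n_hidden)
        h[j] = 1.0                        # stimulate one hidden neuron
        v = sigmoid(W @ h + b_visible)    # expected reconstruction at the input layer
        features.append(v)
    return np.array(features)
```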
The weights learned by a restricted Boltzmann machine can be used to initialize a layer of a deep multilayer perceptron, a technique known as layer-wise pretraining.
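A minimal sketch of that initialization, assuming one pretrained RBM per layer (the helper name is illustrative):

```python
def init_layers_from_rbms(rbm_weights, rbm_hidden_biases):
    """Map a stack of pretrained RBMs onto the dense layers of a deep
    feed-forward network.

    Each RBM's visible-to-hidden weight matrix and hidden bias vector
    become the initial parameters of one layer; ordinary supervised
    backpropagation then fine-tunes the whole network.
    """
    return [{"W": W.copy(), "b": b.copy()}
            for W, b in zip(rbm_weights, rbm_hidden_biases)]
```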
The main difference between an autoencoder and a restricted Boltzmann machine lies in how they are trained: an autoencoder is trained with plain backpropagation on a reconstruction loss, whereas a restricted Boltzmann machine is trained by minimising an energy function.
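The standard way to carry out this energy minimisation is contrastive divergence; below is a minimal CD-1 sketch for a binary RBM (this is the common textbook procedure, not necessarily the exact variant the source has in mind):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b_v, b_h, lr=0.1):
    """One contrastive-divergence (CD-1) update for a binary RBM.

    Instead of backpropagating a loss, the update lowers the energy of
    the training example v0 and raises the energy of a one-step Gibbs
    reconstruction of it.
    """
    # Positive phase: hidden probabilities given the data.
    ph0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: reconstruct the visible layer, then re-infer hidden.
    pv1 = sigmoid(h0 @ W.T + b_v)
    ph1 = sigmoid(pv1 @ W + b_h)
    # Gradient estimate: data correlations minus reconstruction correlations.
    W += lr * (np.outer(v0, ph0) - np.outer(pv1, ph1))
    b_v += lr * (v0 - pv1)
    b_h += lr * (ph0 - ph1)
    return W, b_v, b_h
```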
A Gaussian-binary restricted Boltzmann machine (GRBM) extends the architecture to continuous input data by way of a Gaussian classification model. Either the model is used to yield categorical data that is then fed to the neural network, or the probabilities of each class are fed directly to the input layer as weights.
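For comparison, the most common GRBM formulation keeps the hidden units binary and models the visible units as unit-variance Gaussians; here is a minimal Gibbs-step sketch under that assumption (not necessarily the exact scheme described above):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def grbm_gibbs_step(v, W, b_v, b_h):
    """One Gibbs sampling step in a Gaussian-binary RBM (unit variance).

    Hidden units are binary; visible units are real-valued and are
    resampled from a Gaussian centred on the top-down input.
    """
    p_h = sigmoid(v @ W + b_h)                    # infer binary hidden units
    h = (rng.random(p_h.shape) < p_h).astype(float)
    v_new = b_v + h @ W.T + rng.standard_normal(v.shape)  # Gaussian visibles
    return v_new, h
```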
- alias
- subtype
  - Gaussian-binary Restricted Boltzmann Machine (GRBM)
- has functional building block
  - FBB_Feature discovery
- has input data type
  - IDT_Binary vector
  - IDT_Vector of quantitative variables
- has internal model
  - INM_Neural network
- has output data type
- has learning style
  - LST_Unsupervised
- has parametricity
  - PRM_Nonparametric with hyperparameter(s)
- has relevance
  - REL_Relevant
- uses
- sometimes supports
  - ALG_Perceptron
- mathematically similar to