Hopfield network


A Hopfield network is one of the simplest and oldest types of neural network. It does not distinguish between different types of neurons (input, hidden and output); rather, the same neurons are used both to enter input and to read off output. Each neuron has a binary value of either +1 or -1 (not +1 or 0!), and every neuron is linked to every other neuron by a scalar weight, typically with a value between -1 and +1.

The classic training method for a Hopfield network (although there are alternatives) is known as Hebbian learning. Whenever two neurons have the same value in a training pattern, the weight between them is increased slightly; whenever they have different values, the weight between them is decreased slightly.
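The Hebbian rule described above can be sketched as follows (a minimal sketch; the function name, the use of an outer product, and the normalisation by the number of patterns are choices of this illustration, not part of any standard API):

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian learning for a Hopfield network (a minimal sketch).

    patterns: array of shape (n_patterns, n_neurons) with entries +1/-1.
    Returns a symmetric (n_neurons, n_neurons) weight matrix.
    """
    patterns = np.asarray(patterns, dtype=float)
    n_patterns, n_neurons = patterns.shape
    # For each pattern, two neurons that agree contribute +1 to their
    # weight and two that disagree contribute -1: exactly the outer
    # product of the pattern with itself, summed over patterns.
    weights = patterns.T @ patterns / n_patterns
    np.fill_diagonal(weights, 0)  # a neuron is not linked to itself
    return weights
```

Dividing by the number of patterns keeps each weight in the range -1 to +1, matching the description above.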

When input data is presented to the network, the aim is that it should settle on the item from the training data that is most similar to the input, so that a Hopfield network could be used to normalise handwritten letters, classifying them as a side-effect. The normalisation consists of the network being updated typically one node at a time (the order being either random or predetermined), with the value of each node being recalculated as the sign of the weighted sum of all the other nodes' values, i.e. each neighbouring node's value multiplied by the weight of the link connecting it. This continues until the network converges on a stable state, which can be understood as a minimum-energy state.
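The asynchronous update loop just described might look like this (again a sketch; the sweep limit, the random update order and the tie-breaking of zero sums towards +1 are assumptions of this illustration):

```python
import numpy as np

def recall(weights, state, max_sweeps=100, rng=None):
    """Run asynchronous updates until the network stops changing.

    Each chosen neuron is set to the sign of the weighted sum of the
    other neurons' values; with symmetric weights this never increases
    the network's energy, so the loop converges to a stable state.
    """
    rng = np.random.default_rng(rng)
    state = np.array(state, dtype=float)
    n = state.size
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):  # one node at a time, random order
            new = 1.0 if weights[i] @ state >= 0 else -1.0
            if new != state[i]:
                state[i] = new
                changed = True
        if not changed:  # fixed point: a (local) energy minimum
            break
    return state
```

Presenting a slightly corrupted version of a stored pattern and calling `recall` should return the stored pattern itself.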

Unfortunately, Hopfield networks are of little use in practice because they tend to develop spurious stable states (local energy minima) that did not form part of the original training data. For example, a Hopfield network normalising handwritten letters might in effect invent additional letters that then form part of the set of possible classifications.
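One simple example of such a spurious state: because the network's energy is unchanged when every neuron's sign is flipped, the inverse of any stored pattern is itself a stable state, even though it was never trained. A small check of this (the pattern is an arbitrary example):

```python
import numpy as np

# Store a single pattern with the Hebbian rule.
p = np.array([1, 1, -1, -1, 1, -1], dtype=float)
W = np.outer(p, p)
np.fill_diagonal(W, 0)

# The inverse pattern was never trained, yet every neuron already
# agrees with the sign of its weighted input, so it never changes:
# a spurious "memory".
inverse = -p
updated = np.where(W @ inverse >= 0, 1.0, -1.0)
assert np.array_equal(updated, inverse)
```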

Restricted Boltzmann machines, however, are an extension of the Hopfield network concept that work well and have practical uses.

Hopfield net
has functional building block
has input data type
IDT_Binary vector
has internal model
INM_Neural network
has output data type
ODT_Binary vector
has learning style
has parametricity
PRM_Nonparametric with hyperparameter(s)
has relevance
sometimes supports
mathematically similar to