Hopfield Network

Algorithm

A Hopfield Network, introduced by John Hopfield in 1982, is one of the simplest and oldest types of neural network.

A Hopfield network does not distinguish between different types of neurons (input, hidden, and output). Rather, the same neurons are used both to enter input and to read off output. Each neuron has a binary value of either +1 or -1 (not 1 or 0!), and every neuron is linked to every other neuron by a symmetric scalar weight with a value between -1 and +1; no neuron is linked to itself.
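As a minimal sketch (in Python with NumPy; the variable names are illustrative, not from any particular library), the state and weights might be represented like this:

    import numpy as np

    n = 25                                     # e.g. a 5x5 binary image, flattened
    state = np.random.choice([-1, 1], size=n)  # every neuron is either +1 or -1
    weights = np.zeros((n, n))                 # weights[i, j] links neuron i to neuron j
    # The weight matrix is symmetric (weights[i, j] == weights[j, i]) and its
    # diagonal stays at zero, since no neuron is linked to itself.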

The classic training method for a Hopfield network (although there are alternatives) is known as Hebbian learning. Whenever two neurons have the same value in a training pattern, the weight between them is increased slightly; whenever they have different values, the weight between them is decreased slightly.
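A common formulation of this rule sets each weight to the average product of the two neurons' values over all training patterns, which keeps every weight between -1 and +1. A minimal sketch in Python with NumPy (the function name is illustrative):

    import numpy as np

    def hebbian_train(patterns):
        """Build a Hopfield weight matrix from rows of +1/-1 training patterns."""
        patterns = np.asarray(patterns, dtype=float)  # shape (num_patterns, num_neurons)
        # Neurons that agree across patterns accumulate positive weight, neurons
        # that disagree accumulate negative weight; averaging over the patterns
        # keeps every weight in [-1, +1].
        weights = patterns.T @ patterns / len(patterns)
        np.fill_diagonal(weights, 0)                  # no self-connections
        return weights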

When input data is presented to the network, the aim is for the state to settle into the training pattern most similar to the input, so that a Hopfield network could be used to normalize noisy handwritten letters, classifying them as a side effect. The normalization consists of the network being updated typically one node at a time (the order being either random or predetermined), with each node's value recalculated as the sign of the weighted sum of all the other nodes' values, i.e. each other node's value multiplied by the weight linking it to the node being updated, summed and thresholded at zero. This continues until no update changes any node, at which point the network has converged on a stable state that can be understood as a local minimum of the network's energy.
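A minimal sketch of this settling process and the associated energy function, continuing the NumPy sketches above (the function names are illustrative):

    import numpy as np

    def recall(weights, pattern, max_sweeps=100, seed=0):
        """Settle a +1/-1 input pattern towards the most similar stored pattern."""
        rng = np.random.default_rng(seed)
        state = np.array(pattern, dtype=float)
        for _ in range(max_sweeps):
            changed = False
            for i in rng.permutation(len(state)):      # asynchronous, random order
                new_value = 1.0 if weights[i] @ state >= 0 else -1.0
                if new_value != state[i]:
                    state[i], changed = new_value, True
            if not changed:                            # stable: a local energy minimum
                break
        return state

    def energy(weights, state):
        """Hopfield energy of a state; updates never increase this quantity."""
        return -0.5 * state @ weights @ state

Because every single-node update either lowers this energy or leaves it unchanged, the settling process is guaranteed to stop at a stable state.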

Unfortunately, Hopfield networks are of little use in practice: their storage capacity is low (on the order of 0.14 patterns per neuron), and they tend to settle into spurious local optima that did not form part of the original training data. For example, a Hopfield network normalizing handwritten letters might in effect invent additional letters that then form part of the set of possible classifications.

Restricted Boltzmann Machines are, however, an extension of the Hopfield network concept that works well and has practical uses.

Alias
Hopfield net
Related terms
Neural Network
Hebbian Learning
Restricted Boltzmann Machine