Hopfield network

A Hopfield network is a special form of artificial neural network. It is named after the American scientist John Hopfield, who introduced the model in 1982.

Structure

Hopfield networks belong to the class of feedback (recurrent) networks. A Hopfield network has only a single layer, which acts as both the input and the output layer. Each of the binary McCulloch-Pitts neurons is connected to every neuron other than itself. The neurons can take the values −1 and 1, corresponding to the states "not firing" and "firing".

In Hopfield networks, the synaptic weights are symmetric, i.e. w_ij = w_ji holds for all i and j. While this is not biologically plausible, it allows the definition of an energy function and the analysis of the networks with methods of statistical mechanics.

Since the vectors for the input and the output have the same length, one also speaks of an auto-association network.
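The structural constraints described above can be illustrated with a small NumPy sketch (the size N = 5 and the random weights are purely illustrative, not part of the model):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5  # number of neurons (illustrative size)

# States are binary: -1 ("not firing") or 1 ("firing").
x = rng.choice([-1, 1], size=N)

# A symmetric weight matrix with zero diagonal, as required by the
# Hopfield model: no self-connections, and w_ij = w_ji.
W = rng.normal(size=(N, N))
W = (W + W.T) / 2
np.fill_diagonal(W, 0)

assert np.allclose(W, W.T)       # symmetry
assert np.all(np.diag(W) == 0)   # no self-connections
```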

Operation

When implementing a Hopfield network, the question arises whether the states of the neurons should be updated synchronously or asynchronously.

  • Synchronous update means that in one iteration step all neurons are updated simultaneously.
  • Asynchronous update means that a neuron is chosen at random, its new value is computed, and this value is immediately taken into account in subsequent computations.

The asynchronous mode of updating is the most widespread for Hopfield networks.

Pattern recovery with Hopfield networks

Hopfield networks can be used as auto-associative memory in order to reconstruct noisy or only partially known patterns. This takes place in three phases:

Training phase

Here a number L of predefined patterns is stored in the network. This is done by adjusting the synaptic weights: what is sought is a suitable symmetric weight matrix of size N × N. It can be calculated, for example, in a single step with the following rule, which is also referred to as the generalized Hebbian learning rule:

w_ij = (1/N) · Σ_{l=1..L} x_i^(l) · x_j^(l) for i ≠ j, and w_ii = 0,

where x^(l) ∈ {−1, 1}^N denotes the l-th pattern to be stored.

One would generally like to store as many different patterns as possible in a Hopfield network. However, the storage capacity is limited by the ratio L/N: reliable retrieval is only possible up to about L ≈ 0.138·N patterns.
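As a minimal NumPy sketch of the training phase (the two patterns are made-up toy data), the generalized Hebbian rule can be written as an outer-product sum:

```python
import numpy as np

def train_hebbian(patterns):
    """Generalized Hebbian rule: w_ij = (1/N) * sum_l x_i^l x_j^l, w_ii = 0.

    `patterns` is an (L, N) array of -1/1 pattern vectors.
    """
    patterns = np.asarray(patterns)
    L, N = patterns.shape
    W = patterns.T @ patterns / N   # sum of outer products over all L patterns
    np.fill_diagonal(W, 0)          # no self-connections
    return W

# Two toy patterns of length N = 4 (hypothetical example data).
patterns = np.array([[1, -1, 1, -1],
                     [1, 1, -1, -1]])
W = train_hebbian(patterns)
```

The resulting matrix is symmetric with zero diagonal, as the model requires.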

Inputting a test pattern

Now a test pattern, for example a noisy or incomplete image, is fed into the network. To do this, one simply sets the neurons to the states corresponding to the test pattern.

Computational phase

The neurons are updated asynchronously with the following rule: x_i ← sgn(Σ_j w_ij · x_j − θ_i), where x_i is the state of the neuron to be updated and θ_i is a threshold value.

Depending on the number of iteration steps, the result in this case is a more or less well denoised image. Up to a ratio L/N of about 0.138 (the ratio of stored patterns to neurons of the Hopfield network), the Hebbian rule guarantees that the system no longer changes once it has reached a state corresponding to one of the stored patterns. It can also be shown that the system always reaches a stable final state.
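The asynchronous update rule above can be sketched as follows (a minimal NumPy implementation; the function name, the step count, and the toy pattern are assumptions, and sgn(0) is taken as +1):

```python
import numpy as np

def recall(W, x, theta=0.0, steps=200, rng=None):
    """Asynchronous recall: repeatedly pick a random neuron i and set
    x_i = sgn(sum_j w_ij x_j - theta)."""
    rng = np.random.default_rng(rng)
    x = np.array(x, dtype=float)
    for _ in range(steps):
        i = rng.integers(len(x))        # choose one neuron at random
        h = W[i] @ x - theta            # local field at neuron i
        x[i] = 1.0 if h >= 0 else -1.0  # sgn, with sgn(0) -> +1
    return x

# Toy usage: store one pattern, corrupt one bit, then recall it.
p = np.array([1, -1, 1, 1, -1, 1, -1, -1], dtype=float)
W = np.outer(p, p) / len(p)
np.fill_diagonal(W, 0)
noisy = p.copy()
noisy[2] = -noisy[2]                 # flip one bit
restored = recall(W, noisy, rng=0)
assert np.array_equal(restored, p)   # the stored pattern is recovered
```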

Three final states are possible:

  • The pattern is recognized correctly.
  • The inverted pattern is recognized.
  • No pattern is recognized; the network reaches a stable spurious state that does not correspond to any of the stored patterns.

Relation to statistical mechanics

For the Hopfield model there exists an energy function of the form E = −(1/2) · Σ_ij w_ij · x_i · x_j + Σ_i θ_i · x_i, whose value, as can be proved, decreases or stays the same at each update according to the above rule. Only for the stable patterns (and the spurious states) does the energy remain constant, so these represent local minima of the energy landscape.
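A small sketch of this energy function (the stored pattern and the flipped bit are toy data) shows that a noisy state lies higher in the energy landscape than the stored pattern it is attracted to:

```python
import numpy as np

def energy(W, x, theta=0.0):
    # E = -1/2 * sum_ij w_ij x_i x_j + sum_i theta_i x_i
    return -0.5 * x @ W @ x + np.sum(theta * x)

# Store a single pattern with the Hebbian rule (toy data, N = 8).
p = np.array([1, -1, 1, 1, -1, 1, -1, -1], dtype=float)
W = np.outer(p, p) / len(p)
np.fill_diagonal(W, 0)

noisy = p.copy()
noisy[0] = -noisy[0]  # one corrupted bit

# The stored pattern is a local minimum, so its energy is lower
# than that of the noisy state.
assert energy(W, p) < energy(W, noisy)
```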

There is a connection between the Hopfield model and the Ising model, whose energy function has the same form. The similarity to spin glasses, in which the couplings are randomly distributed, is particularly great. Using methods of theoretical physics, it could be shown that Hopfield networks can only be used as an associative memory up to a ratio of L/N ≈ 0.138.
