Week 5: Hopfield Network Flashcards
Why associative memory models? - (4)
- Introduce learning
- Introduce fundamental ideas about associating patterns of neural activity
- Associating patterns or sequences of patterns is needed for episodic memory
- The hippocampal anatomy maps very well onto these ideas
The Hopfield Network uses very simple neurons, which are the
standard artificial neurons with no dynamics
Representation of the Hopfield (1982) Associative Memory Network shows (2)
- All the neurons are connected with each other
- Neuron Si is connected to Sj with a weight of wij
Assumption of the Hopfield Associative Memory Network
Assume a fully connected network with symmetric connections (Wij = Wji)
Properties of the Hopfield (1982) Associative Memory Network (5)
- Simple connectionist neurons
- No dynamics
- We impose the update schedule
- Sign function as a transfer function
- Units can be active (Si = 1) or inactive (Si = -1)
In the Hopfield network, the sign function as the transfer function means: (2)
If the input is below 0, the unit is set to -1
If the input is above 0, the unit is set to 1
Equation of activity of a neuron in the Hopfield Associative Memory Network
Si = sgn(Σj wij Sj), i.e., each unit takes the sign of the weighted sum of its inputs
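A minimal Python sketch of a single unit's activity update, Si = sgn(Σj wij Sj); all variable and function names here are assumptions, not from the lecture:

```python
import numpy as np

def update_unit(S, W, i):
    """Update unit i of a Hopfield network (sketch, assumed names).

    S: vector of unit states (+1 active, -1 inactive)
    W: symmetric weight matrix (W[i, j] = W[j, i])
    """
    # Weighted input to unit i from all units
    h = W[i] @ S
    # Sign transfer function: active (+1) if input > 0, inactive (-1) otherwise
    return 1 if h > 0 else -1

S = np.array([1, -1, 1])
W = np.array([[ 0.0, -0.5,  1.0],
              [-0.5,  0.0, -0.5],
              [ 1.0, -0.5,  0.0]])
print(update_unit(S, W, 0))  # input = (-0.5)(-1) + (1)(1) = 1.5 > 0, so prints 1
```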
What do we mean by symmetric connections in Hopfield Associative Memory Network?
We mean that the weight in one direction is the same as the weight in the other direction (wij = wji)
Hebbian learning proposes that
neurons that “fire together wire together”
Hebbian learning proposes that:
neurons that “fire together, wire together”; in other words,
if the sender and receiver are both active (3)
- Sender likely contributed to making the receiver fire!
- Thus, it strengthens the connection between sender and receiver
- That is the weight increases
In Hebbian learning,
the weights of synaptic connections between neurons change mathematically in the Hopfield Network by
taking a weight (wij) and adding to it the product of the activities of the pre- and postsynaptic neurons times a very tiny number (epsilon): wij ← wij + ε · Si · Sj
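The weight change can be sketched as follows (the function name and the epsilon value are assumptions); using an outer product applies the rule to every pair of units at once:

```python
import numpy as np

def hebbian_step(W, S, epsilon=0.01):
    """One Hebbian update: w_ij <- w_ij + epsilon * S_i * S_j (sketch)."""
    # Outer product gives S_i * S_j for every pair of units
    dW = epsilon * np.outer(S, S)
    np.fill_diagonal(dW, 0)  # no self-connections
    return W + dW

S = np.array([1, 1, -1])
W = np.zeros((3, 3))
W = hebbian_step(W, S)
# Weights between units that agree went up; mixed pairs went down,
# and the matrix stays symmetric (w_ij = w_ji)
print(W)
```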
The symbol ε in the weight equation means: (2)
- A tiny number, as we don’t want to change the weights in the network too quickly
- In most cases you want to incrementally learn something new (so have multiple presentations of two stimuli to associate them together)
The first step of Hopfield network learning is to
impose a pattern we want to learn, then let the learning rule act
What do we mean by imposing a pattern?
To impose a pattern, we clamp the activity of a subset of neurons to the pattern and let the learning rule act to change the synaptic weight connections in the network
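The clamping-and-learning step can be sketched in Python (all names and the epsilon value are assumptions): the pattern is held fixed while repeated presentations gradually build up the weights.

```python
import numpy as np

epsilon = 0.01
pattern = np.array([1, 1, -1, -1])  # clamped states: not updated during learning
W = np.zeros((4, 4))

for presentation in range(5):
    # Hebbian step: w_ij grows if the two units agree, shrinks if they differ
    W += epsilon * np.outer(pattern, pattern)
    np.fill_diagonal(W, 0)  # no self-connections
```

Each presentation nudges every weight by ±epsilon, so the association builds up over multiple presentations rather than in a single step, matching the idea of incremental learning above.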
Diagram of an example of imposing a pattern, for instance Pattern 1 - (6)
- In Pattern 1, a given number of neurons are active (orange) and we keep them active
- That is, the activity (i.e., firing state) of these neurons cannot be updated
- Then we let the learning rule act between all these neurons
- Connections between blue and orange neurons are not strengthened (-1 [inactive] × 1 [active] = -1), so the weight decreases
- Connections between orange and orange neurons are strengthened (1 × 1 = 1), so that in the future we don’t need to force these neurons to be active: one neuron makes the other one fire
- Connections between two silent neurons (two blue: -1 × -1 = +1) are also strengthened, so the weight increases, meaning one silent neuron keeps the other one silent
Learning rule table when imposing a pattern:
for the two neurons (3)
- Both -1: weight goes up = connection strengthened
- Both 1: weight goes up = connection strengthened
- Mixed: weight goes down, which may lead to pruning of connections.
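The table above follows directly from the sign of the weight change, ε · Si · Sj; a quick sketch (epsilon value is an assumption):

```python
# The weight change is epsilon * S_i * S_j, so its direction depends only
# on whether the two units agree (both +1 or both -1) or disagree (mixed).
epsilon = 0.01
for s_i, s_j in [(-1, -1), (1, 1), (1, -1)]:
    dw = epsilon * s_i * s_j
    direction = "up (strengthened)" if dw > 0 else "down (weakened)"
    print(f"S_i = {s_i:+d}, S_j = {s_j:+d}: weight goes {direction}")
```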
With more neurons in the Hopfield network (imposing patterns and letting the learning rule act)
we can store many more patterns
In the Hopfield network, patterns of activation are (2)
- learned as ‘stable states’ under the rule for updating activations
- if we clamp the activity of neurons for one pattern (some active, some silent) and let the learning rule act, the weights will change until the update rule produces no more change in the set of active neurons
Stable states mean
update rule produces no more changes in active neurons
When the pattern of activation does not change anymore, we say…
we say a stable state has been reached
We can apply the update rule to units in the Hopfield network model in two ways: (2)
Asynchronously: one unit is updated at a time, picked at random or in a pre-defined order
Synchronously: all units are updated at the same time.
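The two update schedules can be sketched as follows (function names are assumptions); note the asynchronous version lets later updates see the results of earlier ones within a sweep, while the synchronous version computes all new states from the previous state:

```python
import numpy as np

def update_async(S, W):
    """One asynchronous sweep: units updated one at a time, in index order (sketch)."""
    S = S.copy()
    for i in range(len(S)):
        S[i] = 1 if W[i] @ S > 0 else -1
    return S

def update_sync(S, W):
    """One synchronous step: all units updated at once from the previous state (sketch)."""
    return np.where(W @ S > 0, 1, -1)

# A pattern stored with the Hebbian rule, recalled from a one-bit-flipped cue
pattern = np.array([1, 1, -1, -1])
W = np.outer(pattern, pattern)
np.fill_diagonal(W, 0)
cue = np.array([1, -1, -1, -1])
print(update_async(cue, W))  # [ 1  1 -1 -1]
print(update_sync(cue, W))   # [ 1  1 -1 -1]
```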
Under the update rule, many different
patterns can be learned in the same network, but the memory capacity is limited to ~0.14N (N is the number of neurons) in the Hopfield network
Memory in the Hopfield network is
“content addressable”, performing “pattern completion of a partial cue”
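Content-addressable recall can be sketched end to end (all names, sizes, and the 1/N weight scaling are assumptions): store one pattern with the Hebbian rule, corrupt part of it, and run asynchronous updates until a stable state is reached.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50
pattern = rng.choice([-1, 1], size=N)

# Hebbian storage: w_ij = (1/N) * S_i * S_j, no self-connections
W = np.outer(pattern, pattern) / N
np.fill_diagonal(W, 0)

# Partial cue: flip 10 of the 50 bits
cue = pattern.copy()
flip = rng.choice(N, size=10, replace=False)
cue[flip] *= -1

# Asynchronous updates until no unit changes (a stable state)
S = cue.copy()
changed = True
while changed:
    changed = False
    for i in range(N):
        new = 1 if W[i] @ S > 0 else -1
        if new != S[i]:
            S[i] = new
            changed = True

print(np.array_equal(S, pattern))  # True: the full pattern is completed from the partial cue
```

With a single stored pattern well under the ~0.14N capacity limit, every corrupted bit is pulled back toward the stored state, which is exactly the "pattern completion of a partial cue" described above.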