Hopfield and attractor networks Flashcards
(13 cards)
hopfield network short
a fully connected recurrent network
recurrent
each neuron is connected to every other neuron
hopfield mechanism
stores memories (patterns of -1 and +1) by creating a kind of energy landscape. connections between neurons are set up in a symmetric way, meaning the influence from A to B is the same as from B to A, which gives the network a well defined energy function that always decreases as the network updates.
Instead of looking up a memory like a dictionary, the hopfield network dynamically finds it based on similarity.
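a minimal numpy sketch of the storage step and the energy function, assuming an array `patterns` of shape (n_patterns, n_neurons) with -1/+1 entries (names are illustrative, not from the cards):

import numpy as np

def store_patterns(patterns):
    # hebbian rule: W[i, j] = (1/N) * sum over patterns of x_i * x_j, with zero diagonal.
    # W is symmetric by construction, which is what gives a well defined energy function.
    n_neurons = patterns.shape[1]
    W = patterns.T @ patterns / n_neurons
    np.fill_diagonal(W, 0)
    return W

def energy(W, state):
    # E = -1/2 * s^T W s; each network update can only lower (or keep) this value.
    return -0.5 * state @ W @ state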
hopfield valley
it's like an energy landscape, a hilly terrain with valleys. each valley represents a memory the network has learned. these valleys are the stable states of the network; once you fall into one, you stay there.
imagine you drop a ball somewhere in the landscape, not exactly in a valley but nearby (a noisy or incomplete version of a memory). the ball will roll downhill following the slope until it settles into the nearest valley.
how is a hopfield model a dynamic system
- the state of each neuron can change at each step
- the network updates neuron values one at a time, or all together
- with every update, the total energy of the system decreases
- eventually the system stops changing - it has reached a stable point
so the dynamics of the system are the step by step changes in neuron activity, driven by rules that guarantee the system eventually stops moving
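a minimal sketch of these update dynamics, reusing the weight matrix W from the storage sketch above (asynchronous updates; function and parameter names are illustrative):

import numpy as np

def recall(W, state, n_sweeps=20, rng=None):
    # asynchronous updates: visit neurons one at a time in random order and set each
    # to the sign of its summed input. with symmetric W and zero diagonal, every flip
    # decreases (or keeps) the energy, so the state settles into a stable attractor.
    rng = np.random.default_rng() if rng is None else rng
    state = state.copy()
    for _ in range(n_sweeps):
        for i in rng.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

starting from a noisy or incomplete pattern, this is the "ball rolling into the nearest valley" from the card above.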
what is a dynamic system
something that evolves over time based on a set of rules
attractor
neural networks that evolve their activity/dynamics over time to settle into stable states called attractors
discrete attractor
a single point or several discrete points form the stable states. can be visualized as valleys in the energy landscape
continuous attractors (ring attractors)
a continuum of stable states. the possible attractor states are no longer discrete, but can vary continuously
activity bump and how does it move
nearby cells excite each other.
moving:
activity copy: the current position of the activity bump is copied into a hidden layer of neurons
movement signals: the hidden layer also receives movement signals
asymmetric projections: the hidden layer sends asymmetric projections back to the attractor layer, nudging the bump in the appropriate direction
conjunctive cells: the neurons in the hidden layer are conjunctive cells that combine the current bump position and the movement signal, encoding both together
steering the bump: this combination of position and movement allows the network to update the head direction smoothly, following the intended movement (sketched below)
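a minimal numpy sketch of the idea; here the conjunctive hidden layer is collapsed into a velocity-gated asymmetric weight term, and all sizes and constants are illustrative assumptions rather than the exact model:

import numpy as np

n = 64                                   # head direction cells around the ring
theta = 2 * np.pi * np.arange(n) / n     # preferred direction of each cell

# local excitation, global inhibition: similar directions excite, everything else inhibits
W_sym = np.cos(theta[:, None] - theta[None, :]) - 0.3

# asymmetric weights, standing in for the projection back from the conjunctive layer:
# they push activity toward neighbouring cells in one direction around the ring
W_asym = np.roll(W_sym, 1, axis=1) - W_sym

state = np.exp(np.cos(theta - np.pi))    # an activity bump centred at pi
velocity = 0.3                           # movement signal, e.g. angular velocity

print("bump starts near cell", state.argmax())
for _ in range(100):
    # symmetric weights maintain the bump, velocity-gated asymmetric weights nudge it along
    drive = W_sym @ state + velocity * (W_asym @ state)
    state = np.maximum(drive, 0)                 # rectify
    state = state / (state.sum() + 1e-12)        # crude normalization ~ global inhibition
print("bump has moved to near cell", state.argmax())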
spurious attractors
with too many memories, the network might recall mixtures of patterns rather than a single correct one
max number of memories
N_patterns ≈ 0.138 × N_neurons
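for example, a network of 1000 neurons can store roughly 0.138 × 1000 ≈ 138 random patterns before recall starts breaking down into spurious mixtures.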
continuous ring attractor characteristics
- layer of conjunctive cells (head direction x angular velocity)
- pairwise correlations between head direction neurons should be preserved across conditions
- persistent activity: activity should not die off when external input is removed
- population activity should be constrained to a ring-like manifold (ring attractor)
- specific connectivity between the head direction cells (local excitation, global inhibition)
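a minimal sketch of how the ring-manifold point can be checked in simulation: collect population activity at many head directions and project it onto two principal components; for a ring attractor the points trace out a closed ring (the idealized bump shapes here are an illustrative assumption):

import numpy as np

n = 64
theta = 2 * np.pi * np.arange(n) / n

# one population vector per head direction: each row is a bump centred at a different angle,
# i.e. one point on the hypothetical ring attractor
angles = np.linspace(0, 2 * np.pi, 100, endpoint=False)
activity = np.exp(np.cos(theta[None, :] - angles[:, None]))

# project onto the top two principal components; a ring attractor gives a closed ring
# rather than a cloud that fills the space
centred = activity - activity.mean(axis=0)
_, _, Vt = np.linalg.svd(centred, full_matrices=False)
proj = centred @ Vt[:2].T
radius = np.linalg.norm(proj, axis=1)
print("relative spread of the ring radius:", radius.std() / radius.mean())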