8 - Recurrent Neural Networks Flashcards
What are the problems with using a time-as-space model?
-computationally expensive (the input layer must be wide enough to hold the whole time window)
-unclear how far back in time the window should go
-semantically not correct (time is represented as extra spatial input dimensions rather than as an ordering)
How can the problems of the time-as-space model be dealt with?
allow activity to reverberate through the network by using RECURRENT CONNECTIONS -> dynamic network activity
What analogy can you use to explain reverberating activity?
throwing a stone into a puddle: the waves on the puddle can tell you where the input (the stone) came from
In Elman Networks specifically, what is the purpose of feedback weights?
they connect the hidden layer to the context layer, so the context layer holds a copy of the previous hidden-layer activity
Elman Network - What sort of activity is seen between hidden layer and context layer?
reverberating activity
Why can't a perceptron (single rate neuron model) perform an XOR function?
the output classes are not linearly separable: no single line can split the XOR inputs into the correct 0s and 1s
What can solve the XOR problem?
adding layers of neurons (a hidden layer)!
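A minimal sketch of how one hidden layer solves XOR (the weights here are hand-picked for illustration; any weights realising "OR and not AND" would do):

```python
def step(z):
    return 1 if z > 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)    # hidden unit acting like OR
    h2 = step(x1 + x2 - 1.5)    # hidden unit acting like AND
    return step(h1 - h2 - 0.5)  # output: OR and not AND = XOR

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", xor_net(x1, x2))
```

The hidden units re-map the four inputs so that the output unit only has to draw one line, which no single neuron could do on the raw inputs.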
Are there any feedback connections in the feedforward model (outputs which feed back into the network)?
no (they are called feedforward rate networks for a reason)
What additional layer does an Elman Network have compared to a Feedforward Network?
the context layer
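A minimal NumPy sketch of one Elman forward step (layer sizes and random weights are assumptions for illustration): the hidden state is copied into the context layer, which feeds back into the hidden layer alongside the next input:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 5, 3

W_in = rng.normal(size=(n_hidden, n_in))       # input -> hidden weights
W_ctx = rng.normal(size=(n_hidden, n_hidden))  # context -> hidden weights
W_out = rng.normal(size=(n_out, n_hidden))     # hidden -> output weights

def step(x, context):
    hidden = np.tanh(W_in @ x + W_ctx @ context)
    output = W_out @ hidden
    return output, hidden                      # hidden is copied into the context layer

context = np.zeros(n_hidden)                   # context starts empty
for x in rng.integers(0, 2, size=(4, n_in)).astype(float):
    y, context = step(x, context)              # activity reverberates forward in time
```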
What tool is used to learn sequence information and predict the sequential XOR input?
backpropagation
When the error is reduced, does the ability of the network to predict the next letter/number in the sequence increase or decrease?
increases
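A hedged sketch, assuming PyTorch (whose nn.RNN is an Elman-style recurrent layer): train on the sequential-XOR stream, where every third bit is the XOR of the previous two, and watch the prediction error fall under backpropagation:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_stream(n_triples):
    # two random bits followed by their XOR, repeated
    bits = []
    for _ in range(n_triples):
        a, b = torch.randint(0, 2, (2,))
        bits += [a.item(), b.item(), a.item() ^ b.item()]
    return torch.tensor(bits, dtype=torch.float32)

stream = make_stream(500)
x = stream[:-1].view(1, -1, 1)   # input: each bit in turn
y = stream[1:].view(1, -1, 1)    # target: the next bit

rnn = nn.RNN(input_size=1, hidden_size=8, batch_first=True)
readout = nn.Linear(8, 1)
opt = torch.optim.Adam(list(rnn.parameters()) + list(readout.parameters()), lr=0.01)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(200):
    h, _ = rnn(x)
    loss = loss_fn(readout(h), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    if epoch % 50 == 0:
        print(f"epoch {epoch}: loss {loss.item():.3f}")
```

Only every third bit is predictable, so the loss falls as the network learns the XOR bits but plateaus above zero on the random ones.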
Do words with similar semantics have similar or different patterns of activity? (in the letter-prediction task, judging by the level of error)
similar semantics give similar patterns of activity
What does adding recurrent connections between layers (reverberating activity) allow?
-introduces a new kind of ‘memory’ capacity which enables networks to represent temporal structure in the sequences of training patterns
Why can you read a text that has letters omitted from its words?
What sort of memory is this called?
What network is this memory seen in?
-because there are patterns in language which the brain has stored (Elman)
-associative memory
-Hopfield Network
What is a cell assembly in memory?
What is a cell assembly able to do?
-a selection of neurons connected with increased synaptic weights
-this increase in synaptic weights of connections between neurons allows you to store an item in memory
How do cell assemblies contribute to associative memory? (Hebbian)
-items stored via creation of cell assembly
-associated items can be recalled through activation of the cell assembly (thanks to the increased synaptic weights between its neurons in a Hebbian network)
What learning is this: neurons that fire together, wire together?
What does this mean?
Hebbian Learning: when neurons are active together, the connections between them are strengthened
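A minimal sketch of the Hebbian rule (the learning rate eta and the outer-product form are standard textbook choices, not taken from these cards): co-active units strengthen their mutual weights:

```python
import numpy as np

eta = 0.1                      # assumed learning rate
x = np.array([1, -1, 1, 1])    # activity pattern: +1 active, -1 inactive

W = np.zeros((4, 4))
W += eta * np.outer(x, x)      # fire together -> wire together
np.fill_diagonal(W, 0)         # no self-connections
print(W)
```

Units 0, 2 and 3 were active together, so the weights between them grow; weights to the inactive unit 1 are weakened.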
In ferromagnets, is the direction of an atom's spin independent of the neighbouring atoms' spins?
no, they are not independent - neighbouring spins have strong or weak interactions which influence each other
If an atom in a ferromagnet is excited so that its spin points in the opposite direction, what happens afterwards when you apply the Hebbian dynamics eqn.?
How does this relate to Hebbian Learning/Networks?
-the atom flips back to its original orientation
-this is an example of recalling a memory: the original orientation is the 'stored' memory which the dynamics restore
What is the definition of a Hopfield Network?
a recurrent network with symmetric weights and no self-connections
In the definition of a Hopfield Network, what does it mean when the weights are symmetric?
the weight of the connection from neuron j to neuron i is equal to the weight from neuron i to neuron j (w_ij = w_ji)
In a Hopfield Network, what form does the output take?
What is the output in a Hopfield Network?
-either active +1 or inactive -1
-the output OF EACH UNIT is the weighted sum of its inputs pushed through a step function to generate +1 or -1
Do individual neurons interact with themselves in a Hopfield network?
no, there are no self-connections
How are memories stored in a Hopfield Network?
they are stored as low-energy states of the network dynamics
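A minimal Hopfield sketch (pattern count, size and noise level are assumptions for illustration): store +/-1 patterns with a Hebbian outer-product rule, then recall one from a corrupted cue. Each asynchronous update can only lower the energy E(s) = -1/2 s^T W s, so the state settles into a stored low-energy memory:

```python
import numpy as np

rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(3, 50))   # three +/-1 memories

N = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / N  # Hebbian storage; symmetric: W[i,j] == W[j,i]
np.fill_diagonal(W, 0)                         # no self-connections

def energy(s):
    return -0.5 * s @ W @ s

# corrupt a stored pattern, then let the dynamics run
s = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
s[flip] *= -1                                  # flip 10 of 50 units

for _ in range(5):
    for i in rng.permutation(N):               # asynchronous updates
        s[i] = 1 if W[i] @ s >= 0 else -1      # step function on weighted sum

print("recovered stored pattern:", np.array_equal(s, patterns[0]))
print("final energy:", energy(s))
```

The flipped units play the role of the excited ferromagnet spins from the earlier card: the interactions pull them back to the stored orientation.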