5: Learning Flashcards
What is NETTalk?
???
What is a neural network?
???
What is the PDP model?
Parallel Distributed Processing ???
What is symbolic AI?
???
How do neural networks differ from symbolic AI?
???
What are some advantages of symbolic-based AI systems?
– A symbolic algorithm can execute anything expressed as following a
sequence of formal rules.
– Large amounts of memorised information can be copied and retrieved
accurately ad infinitum.
– Information processing is relatively fast and highly accurate.
What are some disadvantages of symbolic-based AI systems?
– Maybe not everything can feasibly be expressed as following a sequence of
formal rules, e.g. the Chinese Room argument, various solution searches, meaning.
– Symbolic retrieval of memories can be brittle: recall is all-or-none.
– Many real-world situations are novel and so require adaptation rather than
fast pre-set actions, e.g. everyday situations.
Of symbolic and neural network AI systems, which is more similar to the organisation of the brain? How? Comment on the simplicity of individual neurons.
Neural networks. They are modelled on the organisation of neurons in the brain and allow for parallel rather than serial processing. The brain’s individual processing units are much simpler and slower than a computer’s, yet in many areas its computation is better, suggesting that the brain’s organisation is superior.
What constitutes a neural network?
A collection of interconnected neurons (or units). Some receive input from the environment and some give output to the environment.
What are hidden units? What are they aka?
Neurons/units in a neural network that neither receive input from the environment nor send output to it; they connect only to other units.
How are neurons modelled artificially in neural networks?
As a binary threshold unit (BTU): compute the excitation as the weighted sum of the inputs; if the excitation exceeds a certain threshold, the neuron is considered “excited” and becomes active. In the active state the neuron outputs 1 rather than 0.
What is the formula for calculating the output of an artificial neuron (BTU)?
out(j) = g(Σ_i w(ij)·in(i) - Θ); g(x) = 1 where x > 0; g(x) = 0 where x <= 0
g(x) is the activation function, here being a step function (“stepping” at 0)
Θ is the threshold
j is the jth threshold unit (with a unique Θ)
w(ij) is the weight of the ith input to the jth threshold unit
in(i) is the ith input to the jth threshold unit
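A minimal Python sketch of this computation (the function and variable names are illustrative, not from the notes):

    def btu_output(inputs, weights, theta):
        # excitation: weighted sum of the inputs minus the threshold
        excitation = sum(w * x for w, x in zip(weights, inputs)) - theta
        # step activation function g: output 1 if excitation > 0, else 0
        return 1 if excitation > 0 else 0

    # e.g. a unit with two inputs, weights 0.4 and 0.9, threshold 0.5
    print(btu_output([1, 0], [0.4, 0.9], 0.5))  # 0, since 0.4 - 0.5 <= 0
    print(btu_output([1, 1], [0.4, 0.9], 0.5))  # 1, since 1.3 - 0.5 > 0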
What is an activation function?
A function that defines a neuron’s output given its activation, i.e. the weighted sum of the neuron’s inputs (and, for a threshold unit, minus the threshold).
Name and describe 3 activation functions.
- Step function
  - output 1 once the activation reaches a certain value, 0 otherwise
- Sigmoid
  - output calculated as a point on a sigmoid (S-shaped) curve
  - g(x) = 1/(1 + exp(-x))
- Rectified Linear Unit (ReLU)
  - output is 0 up to a threshold activation (as with the step function), then increases linearly with further increases in activation
  - e.g. with a threshold of 0:
    when x <= 0, g(x) = 0
    when x > 0, g(x) = x
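The three functions side by side as a small Python sketch (a non-zero threshold Θ would be subtracted from the activation before applying g, as in the BTU formula above):

    import math

    def step(x):
        # 1 once the activation passes 0, otherwise 0
        return 1 if x > 0 else 0

    def sigmoid(x):
        # smooth S-shaped curve, output between 0 and 1
        return 1 / (1 + math.exp(-x))

    def relu(x):
        # 0 below the threshold, then increases linearly with x
        return x if x > 0 else 0

    for x in (-2, 0, 2):
        print(x, step(x), round(sigmoid(x), 3), relu(x))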
What is Feedforward Architecture?
???
What is supervised learning?
???
What is recurrent architecture?
???
What are network layers in neural networks?
???
What is the difference between lateral and feedforward connections?
???
For a feedforward-based neural network of n layers, how many are hidden?
n - 2, since you can “see” the input and output layers, while all other layers connect only to other layers and so are hidden.
What is Strictly Layered Architecture?
A neural network system in which there are no lateral connections and each neuron may only connect to others in adjacent layers.
What does it mean for a network to be “fully connected”?
Each neuron is connected to all the other neurons it can be connected to, where the architecture of the network determines which connections are allowed.
What is the concept of Feedforward Pass?
The way input patterns pass through the layers of a feedforward network in series, i.e. layer by layer; within each layer, the signal from the previous layer (or the input) is propagated to all neurons in that layer in parallel.
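A minimal Python sketch of a feedforward pass through a fully connected, strictly layered network (the layer sizes, weights, thresholds and choice of sigmoid activation are illustrative assumptions, not from the notes):

    import math

    def sigmoid(x):
        return 1 / (1 + math.exp(-x))

    def feedforward_pass(inputs, layers):
        # layers: one (weights, thresholds) pair per non-input layer;
        # weights[j][i] is the weight from unit i of the previous layer to unit j
        activations = inputs
        for weights, thresholds in layers:    # layers are processed in series
            activations = [                   # units within a layer act in parallel
                sigmoid(sum(w * a for w, a in zip(row, activations)) - theta)
                for row, theta in zip(weights, thresholds)
            ]
        return activations

    # 2 inputs -> 3 hidden units -> 1 output unit (n = 3 layers, so n - 2 = 1 hidden)
    hidden = ([[0.5, -0.2], [0.1, 0.8], [-0.3, 0.4]], [0.0, 0.1, -0.1])
    output = ([[1.0, -1.0, 0.5]], [0.2])
    print(feedforward_pass([1.0, 0.0], [hidden, output]))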
What is the concept of generalisation?
???