Week 8: The Perceptron - A Supervised Learning Algorithm: How is supervised learning used to perform pattern recognition in a perceptron? Flashcards
**Main goal of these flashcards:** Learn to describe how supervised learning is used to perform pattern recognition in a perceptron
Rosenblatt’s Perceptron (2)
- Describes how a set of examples of stimuli and correct responses can be used to train an artificial neural network to respond correctly via changes in synaptic weights
- Learning is governed by the firing rates of the pre- and post-synaptic neurons and the correct post-synaptic firing rate (i.e., a teaching signal) => “supervised learning”
Rosenblatt’s Perceptron is an
important historical example: instructive for
understanding the aim of using neural networks
for pattern classification.
What is a teaching signal?
It tells the network what the correct output should be
The different types of learning rules (3)
- Unsupervised
- Supervised
- Reinforcement
What is unsupervised learning? (2)
There is no ‘teacher’ or feedback about right and wrong outputs
We just give patterns to the network, apply the learning rule, and see what happens
Examples of unsupervised learning (3)
- Hebbian learning
- Competitive learning rule
- BCM Rule
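The first of these rules can be made concrete in a few lines. Below is a minimal sketch of Hebbian learning, assuming a linear output neuron, an arbitrary learning rate (eta = 0.1), and arbitrary initial weights; none of these values come from the cards.

```python
# Minimal sketch of Hebbian learning: each weight grows in proportion to the
# product of pre-synaptic activity (x) and post-synaptic activity (y).
# No teaching signal is involved. eta and the initial weights are assumed.
eta = 0.1

def hebb_update(w, x, y):
    """One Hebbian step: w_i <- w_i + eta * x_i * y."""
    return [wi + eta * xi * y for wi, xi in zip(w, x)]

w = [0.5, 0.5]                            # initial synaptic weights
x = [1.0, 0.0]                            # pre-synaptic activities
y = sum(wi * xi for wi, xi in zip(w, x))  # linear post-synaptic activity
w = hebb_update(w, x, y)
# The weight from the active input grows; the silent input's weight is unchanged.
```

Note that there is no notion of a "right" answer here: the update depends only on the co-activity of the pre- and post-synaptic neurons, which is exactly what makes the rule unsupervised.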
Supervised learning is
providing a teaching signal
What is reinforcement learning?
Learning driven by occasional reward or punishment
Reinforcement vs supervised learning (2)
In RL, unlike SL, there is no teaching signal for every input-output combination
The network only gets occasional reward or punishment
Perceptron uses
standard artificial neurons with no dynamics
Simple Perceptron (4)
- One output neuron (O1) and two input neurons (x1 and x2)
- The x1 and x2 neurons each have an input weight: w1 = 1 and w2 = 1
- To get the output (activity of the O1 neuron), sum each input activity × its input weight and pass the result through a transfer function (a step function)
- The output neuron is active only if both input neurons are active (so the summed input reaches the threshold of 1.5) = the model performs the logical ‘AND’ function
Diagram of Simple Perceptron
Diagram of Simple Perceptron Table = performs logical ‘AND’ function (3)
If neither input neuron is active, the output neuron is not active
If only one input neuron is active, the output neuron is not active
Only when both input neurons are active is the output neuron active
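The AND truth table above can be checked directly in code. This is a minimal sketch of the simple perceptron described in these cards (w1 = w2 = 1, threshold 1.5, step transfer function); the function name is my own.

```python
# Simple perceptron performing logical AND:
# two inputs, weights w1 = w2 = 1, threshold T = 1.5, step transfer function.
W1, W2, T = 1.0, 1.0, 1.5

def perceptron_and(x1, x2):
    net = W1 * x1 + W2 * x2      # weighted sum of input activities
    return 1 if net >= T else 0  # step function: active only at/above threshold

# Truth table: the output neuron fires only when both inputs are active.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", perceptron_and(x1, x2))
```

Only the (1, 1) input gives a net input of 2, which clears the 1.5 threshold; every other combination sums to 0 or 1 and stays below it.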
Rosenblatt produced a graph of all possible x1 and x2 combinations from the simple perceptron model (4)
The dashed line is the decision boundary
Left of the line, O1 = 0
Right of the line, O1 = 1
We can write the equation of the line: x1 + x2 = 1.5, i.e., x2 = -x1 + 1.5 (rearranged)
In Rosenblatt’s Perceptron graph, the line separating O=1 and O=0 is where the net input equals the
threshold: w1x1 + w2x2 = T
In Rosenblatt’s Perceptron we can rearrange the threshold equation (w1x1 + w2x2 = T) into the equation of a line: (3)
x2 = -(w1/w2)x1 + T/w2
(i.e., the format of y = mx + c)
We had implicitly assumed w1 = w2 = 1
In Rosenblatt’s Perceptron, changing the weights will (2)
change the equation of the line!
x2 = -(w1/w2)x1 + T/w2
Changing the weights is our ….. and …. learning in Rosenblatt’s Perceptron will (2)
mechanism to implement learning
update the position of the line/change decision boundary
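The effect of the weights on the decision boundary can be made concrete. The helper below (my own, not from the cards) returns the slope and intercept of x2 = -(w1/w2)x1 + T/w2:

```python
# Decision boundary of a two-input perceptron: the line where the net input
# equals the threshold, w1*x1 + w2*x2 = T, rearranged as
# x2 = -(w1/w2)*x1 + T/w2  (the y = mx + c format).
def boundary_line(w1, w2, T):
    """Return (slope, intercept) of the decision boundary in the x1-x2 plane."""
    return (-w1 / w2, T / w2)

print(boundary_line(1.0, 1.0, 1.5))  # (-1.0, 1.5): the AND example's line
print(boundary_line(2.0, 1.0, 1.5))  # (-2.0, 1.5): doubling w1 steepens the line
```

Changing w1 or w2 tilts the line and shifts its intercept, which is exactly why adjusting the weights can serve as the mechanism for learning.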
In Rosenblatt’s Perceptron,
learning to classify two groups of input patterns (by changing the weights)
means finding weights so that the line separates the groups
An input group is simply a set of
activity patterns across the input neurons
Pattern classification in Rosenblatt’s Perceptron: example where we want to classify adults and children based on height and weight (6)
- x1 signals weight
- x2 signals height
- Each individual (child/adult) we test is one combination of the two input neurons’ activities (weight/height) = a pattern
- We train the network on many example patterns of individuals; each example changes the input weights, placing a decision boundary that separates the two groups
- The decision boundary can be fuzzy, since some kids are tall or heavy and some adults are small and light, but overall the classifier separates children and adults based on height and weight
- After training the network on 100 individuals (50 adults/50 children), we present a 101st person (not used to train the network) and measure performance by how well the network generalises (i.e., how well it classifies the 101st individual correctly)
Learning or training in the perceptron model means
presenting example patterns, where each example pattern changes the input weights
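The cards do not spell out exactly how each example changes the weights. A reasonable sketch is the standard perceptron learning rule (error correction: each weight moves in proportion to the teaching signal minus the actual output). It is demonstrated here on the AND task from the earlier cards, with the threshold folded into a bias weight; the learning rate and epoch count are assumed values.

```python
def step(net):
    return 1 if net >= 0 else 0

def train_perceptron(patterns, targets, eta=0.1, epochs=50):
    """Perceptron learning rule: for each example, nudge each weight by
    eta * (target - output) * input. The bias weight replaces the threshold."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), t in zip(patterns, targets):
            o = step(w1 * x1 + w2 * x2 + b)
            err = t - o          # teaching signal minus actual output
            w1 += eta * err * x1
            w2 += eta * err * x2
            b  += eta * err      # the bias 'input' is always 1
    return w1, w2, b

# Training set: the four input patterns of the AND task with correct outputs.
patterns = [(0, 0), (0, 1), (1, 0), (1, 1)]
targets  = [0, 0, 0, 1]
w1, w2, b = train_perceptron(patterns, targets)
outputs = [step(w1 * x1 + w2 * x2 + b) for x1, x2 in patterns]
```

Correctly classified examples produce err = 0 and leave the weights alone; only mistakes move the decision boundary. The adult/child classifier would use (weight, height) patterns in place of the binary AND inputs.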
What is the training set?
The set of example patterns across your input neurons (e.g., (x1, x2) pairs representing weight and height)