WK 8 - Neural Networks 1 (AI, The Brain and Neural Computing) Flashcards

1
Q

Brain Structure

A
  • Contains around 100 billion neurons
  • Neurons communicate through synapses – effectively configurable chemical junctions between neurons
2
Q

Neurons parts

A

Dendritic tree: receive signals

Cell body: process signals

Axon: transmit signals

3
Q

Neuronal Function

A

A neuron receives electrical activity from other neurons along its dendrites (inputs)

The axon (effectively the output of the neuron) produces a pulse based on the strength of the incoming pulses

This is then passed on to the other neurons connected to this one

4
Q

Synapse

A

A chemical junction which can be modified, and is therefore thought to be where learning takes place

The synapse can release more neurotransmitter to enhance the coupling between cells

5
Q

Artificial Neuron

A

When two neurons fire together, the connection between them is strengthened (Hebbian learning)
This co-activity of firing is one of the fundamental operations necessary for learning and memory

6
Q

Rosenblatt’s Perceptron Architecture

A

The Perceptron consists of a single layer of artificial neurons or “perceptrons.”

Each perceptron takes a set of input features and produces a binary output (0 or 1).

7
Q

Rosenblatt’s Perceptron Training Data:

A

The training data for the Perceptron algorithm consists of labeled examples, where each example is represented by a set of input features and a corresponding target class label (0 or 1).

8
Q

Rosenblatt’s Perceptron how it works:

A

The system was able to learn by adjusting its weighted connections
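As a sketch (hypothetical hand-picked weights, Python), a single perceptron producing a binary output from weighted connections might look like:

```python
# Sketch of a single perceptron: weighted sum of inputs passed
# through a step threshold. Weights and bias here are illustrative.

def perceptron(inputs, weights, bias):
    # Weighted sum of the inputs plus a bias term
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Step activation: fire (1) if the sum exceeds zero
    return 1 if total > 0 else 0

# Example: weights chosen by hand to compute logical AND
print(perceptron([1, 1], [0.5, 0.5], -0.7))  # 1
print(perceptron([1, 0], [0.5, 0.5], -0.7))  # 0
```

Learning, in this picture, means finding such weights automatically rather than by hand.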

9
Q

Problems with the Rosenblatt’s Perceptron

A

The Perceptron algorithm can only learn and classify linearly separable data.

Binary Classification: the Perceptron algorithm is designed for binary classification tasks, where it assigns instances to one of two classes.

A single-layer Perceptron cannot correctly solve the XOR function.

10
Q

Connectionism

A

Add a further layer of neurons to the network, creating a Multi-Layer Perceptron, to resolve the XOR issue of the single-layer perceptron.

11
Q

Neural computing

A

Neural Computing is based on artificial neural networks (ANNs) that consist of interconnected nodes (neurons) and learn from data through training algorithms. ANNs are inspired by the structure and functioning of biological neural networks in the brain.

12
Q

Traditional AI:

A

Traditional AI encompasses various techniques such as symbolic logic, rule-based systems, expert systems, and search algorithms. It focuses on explicit representation of knowledge and logical reasoning.

13
Q

Artificial Neural Networks (ANNs) learning :

A
  • Supervised Learning
  • Unsupervised Learning
14
Q

Supervised Learning

A

A machine learning approach where:
- artificial neural networks are trained using labeled input-output pairs
- the network then corrects itself based on its output, adjusting its internal parameters (weights and biases) during training to minimize the difference between predicted outputs and actual outputs

15
Q

Unsupervised Learning

A

-The network organises itself according to patterns in the data
-No external ‘desired output’ is provided

16
Q

Perceptron

A

Consists of a set of weighted connections, the neuron (incorporating the activation function), and the output axon

17
Q

Modified Versions of Perceptron Learning

A

The learning can be slowed down by scaling the weight update with a decimal term (learning rate) between 0 and 1.

18
Q

Widrow-Hoff Learning Rule

A

Weight updates are proportional to the error made:

Δ = desired output − actual output
w_i(t+1) = w_i(t) + η Δ x_i(t)
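A minimal sketch of this rule in Python, applied to learn logical AND (η, the initial weights and the epoch count are illustrative choices):

```python
# Error-driven weight updates: w_i(t+1) = w_i(t) + eta * delta * x_i,
# where delta = desired output - actual output.

def predict(x, w, bias):
    return 1 if sum(xi * wi for xi, wi in zip(x, w)) + bias > 0 else 0

# Training set for logical AND: (inputs, target)
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

w, bias, eta = [0.0, 0.0], 0.0, 0.1
for epoch in range(20):
    for x, target in data:
        delta = target - predict(x, w, bias)   # desired - actual
        w = [wi + eta * delta * xi for wi, xi in zip(w, x)]
        bias += eta * delta                    # bias: weight on a constant input of 1

print([predict(x, w, bias) for x, _ in data])  # [0, 0, 0, 1]
```

Because AND is linearly separable, the updates converge to weights that classify all four patterns correctly.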

19
Q

Limitations of the Perceptron

A
  • can only solve linearly separable problems (cannot solve XOR)
20
Q

Multi-Layer Perceptron

A

The single-perceptron limitation can be overcome by adding a further layer to the network.

Three layers
* Input
* Hidden
* Output

21
Q

Activation Functions

A
  • Sigmoid function:
    The steepness of the curve is changed by z
    The derivative can be easily computed
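A sketch of the sigmoid and its derivative in Python (the steepness parameter z defaults to 1 here):

```python
import math

# Sigmoid activation; z controls the steepness of the curve.
def sigmoid(x, z=1.0):
    return 1.0 / (1.0 + math.exp(-z * x))

# The derivative is easily computed from the output itself:
# d/dx sigmoid(x) = z * sigmoid(x) * (1 - sigmoid(x))
def sigmoid_derivative(x, z=1.0):
    s = sigmoid(x, z)
    return z * s * (1.0 - s)

print(sigmoid(0.0))             # 0.5
print(sigmoid_derivative(0.0))  # 0.25
```

That the derivative falls out of the function's own output is what makes the sigmoid convenient for backpropagation.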
22
Q

Weights

A
  • are variable strength connections between
    units
  • propagate signals from one unit to the next
  • main component changed during learning
23
Q

FeedForward - supervised learning algorithm

A

type of neural network architecture where information flows in one direction, from the input layer to the output layer, without cycles or loops.

24
Q

Feed Forward output calculation

A

‒ Multiply incoming signal by weight
‒ Pass this through sigmoid activation function
‒ Pass on this output to units in the next layer
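These three steps can be sketched in Python (layer sizes and weights are illustrative; biases are omitted for brevity):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer_output(inputs, weights):
    """One feed-forward layer. weights[j][i] connects input i to unit j."""
    outputs = []
    for unit_weights in weights:
        # 1. Multiply each incoming signal by its weight and sum
        net = sum(x * w for x, w in zip(inputs, unit_weights))
        # 2. Pass the sum through the sigmoid activation function
        outputs.append(sigmoid(net))
    # 3. These outputs are passed on as inputs to the next layer
    return outputs

hidden = layer_output([1.0, 0.5], [[0.2, -0.4], [0.7, 0.1]])
output = layer_output(hidden, [[0.5, -0.3]])
print(output)
```

Stacking `layer_output` calls gives the full input-to-output flow of a feed-forward network.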

25
Q

Backpropagation (updating rule)

A

Adapt the weights
Start from the output layer and work backwards

New weight: w_ij(t+1) = w_ij(t) + η δ_pj o_pi
(old weight plus learning rate × error signal δ for pattern p on node j × output signal for pattern p from the sending node i)
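A compact illustrative sketch of backpropagation in Python, on a tiny 2-3-1 network learning XOR (architecture, seed, learning rate and epoch count are all assumptions, not the lecture's):

```python
import math
import random

# Backpropagation sketch: adapt the weights starting from the output
# layer and working backwards. Network shape and constants illustrative.

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

n_hidden = 3
# Hidden weights per unit: [w_from_x0, w_from_x1, bias]
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(n_hidden)]
# Output weights: one per hidden unit, plus a bias
w_o = [random.uniform(-1, 1) for _ in range(n_hidden + 1)]

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
eta = 0.5

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    o = sigmoid(sum(wi * hi for wi, hi in zip(w_o, h)) + w_o[-1])
    return h, o

def total_error():
    return sum((t - forward(x)[1]) ** 2 for x, t in data)

error_before = total_error()
for _ in range(2000):
    for x, target in data:
        h, o = forward(x)
        # Error signal on the output node
        delta_o = (target - o) * o * (1 - o)
        # Propagate the error signal back to the hidden nodes
        delta_h = [h[j] * (1 - h[j]) * w_o[j] * delta_o for j in range(n_hidden)]
        # new weight = old weight + eta * node's delta * sender's output
        for j in range(n_hidden):
            w_o[j] += eta * delta_o * h[j]
        w_o[-1] += eta * delta_o              # bias acts on a constant input of 1
        for j in range(n_hidden):
            w_h[j][0] += eta * delta_h[j] * x[0]
            w_h[j][1] += eta * delta_h[j] * x[1]
            w_h[j][2] += eta * delta_h[j]

print(total_error() < error_before)
```

This uses online updating (weights change after each pattern); the total error over the training set falls as training proceeds.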

26
Q

Two Types of Weight Updating

A

Batch Updating : All patterns are presented, errors are calculated, then the weights are updated

Online Updating : The weights are updated after the presentation of each pattern

27
Q

Momentum

A
  • Momentum encourages the network to make large changes to weights if the weight changes are currently large
  • This allows the network to avoid local minima in the early stages as it can overcome hills

new weight change = weight update function + α (w_ij(t) − w_ij(t−1))

where the momentum term α scales the previous weight change
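A minimal sketch of a momentum update in Python (the function name, α = 0.9 and the repeated gradient step are illustrative):

```python
# Momentum: the new weight change is the plain gradient step plus a
# fraction (alpha) of the previous change, so repeated steps in the
# same direction build up speed.

def momentum_step(w, prev_change, gradient_step, alpha=0.9):
    change = gradient_step + alpha * prev_change
    return w + change, change

w, prev = 1.0, 0.0
for step in [-0.1, -0.1, -0.1]:       # same gradient step three times
    w, prev = momentum_step(w, prev, step)
    print(round(prev, 3))             # changes grow: -0.1, -0.19, -0.271
```

The accumulated change is what lets the network roll over small hills in the error surface.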

28
Q

Symbolic AI

A
  • represent knowledge in the form of symbols, rules, and relationships.
  • manipulation and processing of symbolic representations of knowledge and logic.
  • Expert System (IF-THEN rules)
  • Enabling Reasoning
  • Knowledge Programmed (by Humans)
  • Serial (fragile)
  • Does Not Generalise (outside scope)
  • Understandable/Explainable
29
Q

Connectionism

A

Implicit Representation (numbers)
- Neural Network (weighted graph)
- Enabling Perception
- Knowledge Learned (from Data)
- Distributed (graceful degradation)
- Generalise (outside scope)
- Black-box

30
Q

Neural Network Properties

A

-Able to learn to relate input variables to required output
- Is able to generalise between samples
- Shows graceful degradation

31
Q

Classification

A

assigning input data to predefined categories or classes.

32
Q

Regression

A

involve predicting a continuous output value based on input variables.

33
Q

Graceful Degradation

A

Symbolic systems: the removal of one component of the system results in failure

Removal of neuron(s) from a neural network: reduces performance but probably does not result in overall failure

34
Q

‘Generalisation’ in Symbolic AI

A

Extremely difficult. Symbolic systems can operate as expert systems in constrained environments but will quickly fail if taken out of that environment.

35
Q

Generalisation in Neural Networks

A

Neural networks can learn common patterns in data, and can therefore learn the distinctions between different classes of output

36
Q

Classification use

A

designed to group samples according to some known property.

– Minimum 2 datasets required – training and testing

37
Q

Data Representation Issues

A
  • Continuous data: Good data type for neural networks
  • Integer-type Data
    -Discrete Categories: Each value needs to have separate representation in the network
38
Q

categorical variables representations

A
  • Field-type
  • Thermometer-type
39
Q

Field-type:

A

Field-type representation is a method where each category is associated with a separate field or feature.

Each field represents a specific category, and the value of that field indicates the presence or absence of that category.

40
Q

Thermometer-type:

A

Thermometer-type representation is a method where each category is associated with a “thermometer” vector or bar.

41
Q

Thermometer-type vs Field

A

The difference between the two representations lies in the way categories are encoded.

The thermometer-type representation uses binary vectors to capture the degree of membership or association with each category,

while the field-type representation uses separate fields to indicate the presence or absence of each category.
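As a sketch in Python (the ordered "size" categories are illustrative), the two encodings differ like this:

```python
# Field-type vs thermometer-type encoding of a categorical variable
# with ordered categories small < medium < large.

categories = ["small", "medium", "large"]

def field_type(value):
    # One field per category; 1 marks the category that is present
    return [1 if c == value else 0 for c in categories]

def thermometer_type(value):
    # Bits fill up like mercury: every category up to the value is 1
    level = categories.index(value)
    return [1 if i <= level else 0 for i in range(len(categories))]

print(field_type("medium"))        # [0, 1, 0]
print(thermometer_type("medium"))  # [1, 1, 0]
```

The thermometer form preserves the ordering between categories, which the field form deliberately discards.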

42
Q

Missing Values

A
  • Occur frequently in real world data
  • Cannot be entered directly into the network
  • Requires some value in each row
43
Q

Overfitting

A

Overfitting occurs when we train the network on a task for too long

The network learns the noise in the input as well as the
common patterns

The result is poor performance on unseen examples

44
Q

Early Stopping (to avoid overfitting)

A
  • cross-validation

Use three sets – training, testing and cross-validation – and stop training when error on the cross-validation set stops improving
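A sketch of an early-stopping check in Python (the function name, patience value and mock error curve are illustrative):

```python
# Early stopping: halt training once the cross-validation error stops
# improving, and keep the epoch where it was lowest.

def early_stop(val_errors, patience=2):
    """Return the epoch to stop at, given per-epoch validation errors."""
    best_epoch, best_err, waited = 0, float("inf"), 0
    for epoch, err in enumerate(val_errors):
        if err < best_err:
            best_epoch, best_err, waited = epoch, err, 0
        else:
            waited += 1
            if waited >= patience:   # no improvement for `patience` epochs
                break
    return best_epoch

# Validation error falls, then rises as the network starts overfitting
print(early_stop([0.9, 0.6, 0.4, 0.45, 0.5, 0.6]))  # 2
```

Training would resume from (or keep) the weights saved at the returned epoch, before overfitting set in.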