Self-Organising Map Flashcards

1
Q

Define SOM and its purpose

A

An unsupervised learning algorithm used to map high-dimensional data to a low-dimensional representation whilst retaining its topological properties.

Used for data visualisation, data mining, speech analysis, etc.

2
Q

What are the 2 assumptions that SOM works on?

A
  • Data with similar features belong to the same class.
  • SOM can identify features in data as well as organise the data in a lower dimension.
3
Q

How many weights does each node in the output map have?

A

1 for every input node.
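
A minimal sketch of what this looks like in practice, assuming a hypothetical 10×10 output map and 3-dimensional input vectors (e.g. RGB colours); the sizes are illustrative only:

```python
import numpy as np

# Hypothetical sizes for illustration: a 10x10 output map
# and 3-dimensional input vectors (e.g. RGB colours).
map_rows, map_cols, n_inputs = 10, 10, 3

# Each output node carries one weight per input node, so the
# weight array has shape (map_rows, map_cols, n_inputs).
rng = np.random.default_rng(0)
weights = rng.random((map_rows, map_cols, n_inputs))

print(weights.shape)   # (10, 10, 3)
print(weights[0, 0])   # the 3 weights of the top-left node
```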

4
Q

Define the competitive process…

A

Nodes in the output map compete with one another to be the most similar to the input pattern. Similarity is calculated via Euclidean distance.

The output node with the lowest Euclidean distance is selected as the Best Matching Unit (BMU).
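
A minimal sketch of the competition step, assuming the output map's weights are stored in a NumPy array of shape (rows, cols, n_inputs) as in the earlier sketch; the helper name find_bmu is made up for illustration:

```python
import numpy as np

def find_bmu(weights, x):
    """Return the (row, col) index of the Best Matching Unit: the
    output node whose weight vector has the smallest Euclidean
    distance to the input pattern x."""
    # distances has shape (map_rows, map_cols)
    distances = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(distances), distances.shape)

# Example with made-up data
rng = np.random.default_rng(1)
weights = rng.random((10, 10, 3))
x = np.array([0.2, 0.7, 0.1])
print(find_bmu(weights, x))
```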

5
Q

Define the cooperation process…

A

Once the BMU has been established, it updates the weights of itself and its neighbours.

The BMU moves closer to the input pattern. Its neighbours also move closer, but to a lesser extent.
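
A minimal sketch of the cooperation step, assuming a Gaussian neighbourhood function (one common choice, not the only one); the function and parameter names are hypothetical:

```python
import numpy as np

def update_weights(weights, x, bmu, learning_rate, sigma):
    """Move the BMU and its neighbours towards the input pattern x.
    The Gaussian neighbourhood makes nodes far from the BMU move
    less than the BMU itself."""
    rows, cols, _ = weights.shape
    grid_r, grid_c = np.indices((rows, cols))
    # Squared grid distance of every node to the BMU
    dist_sq = (grid_r - bmu[0]) ** 2 + (grid_c - bmu[1]) ** 2
    # Neighbourhood influence: 1 at the BMU, decaying with distance
    influence = np.exp(-dist_sq / (2 * sigma ** 2))
    # Each node moves towards x in proportion to its influence
    weights += learning_rate * influence[..., np.newaxis] * (x - weights)
    return weights
```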

6
Q

What are the steps of SOM?

A
  1. Initialise all weights and choose the parameter values (neighbourhood size, learning rate).
  2. Select a random input pattern and feed it into the network.
  3. Iterate over all nodes in the output map, conducting the competition to establish the BMU.
  4. Update the BMU's weights and those of its neighbours.
  5. Repeat from step 2 until convergence.
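
A minimal end-to-end sketch of these steps in NumPy; the map size, iteration budget, decay schedules, and parameter defaults are illustrative assumptions rather than fixed choices:

```python
import numpy as np

def train_som(data, map_rows=10, map_cols=10, n_iters=1000,
              lr0=0.5, sigma0=5.0):
    """Minimal SOM training loop following the steps above."""
    rng = np.random.default_rng(0)
    n_inputs = data.shape[1]
    # Step 1: initialise all weights and choose parameter values
    weights = rng.random((map_rows, map_cols, n_inputs))
    grid_r, grid_c = np.indices((map_rows, map_cols))
    for t in range(n_iters):
        # Decay the learning rate and neighbourhood size over time
        lr = lr0 * np.exp(-t / n_iters)
        sigma = sigma0 * np.exp(-t / n_iters)
        # Step 2: select a random input pattern
        x = data[rng.integers(len(data))]
        # Step 3: competition - the node with the smallest Euclidean
        # distance to x becomes the BMU
        dists = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # Step 4: cooperation - BMU and neighbours move towards x,
        # weighted by a Gaussian neighbourhood function
        dist_sq = (grid_r - bmu[0]) ** 2 + (grid_c - bmu[1]) ** 2
        influence = np.exp(-dist_sq / (2 * sigma ** 2))
        weights += lr * influence[..., np.newaxis] * (x - weights)
        # Step 5: loop back to step 2 until the iteration budget is spent
    return weights

# Hypothetical usage: organise 500 random RGB colours on a 10x10 map
colours = np.random.default_rng(1).random((500, 3))
som = train_som(colours)
```
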
7
Q

What are the 2 parameters in SOM?

A

Neighbourhood size

Learning rate

8
Q

How should the learning rate change as learning progresses?

A

The learning rate should start large and decrease as learning progresses, so that early updates make coarse adjustments and later updates fine-tune the map without overshooting its converged state.
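
A minimal sketch of one possible schedule (exponential decay is an assumption here, not the only choice; the initial rate and iteration budget are made up for illustration):

```python
import numpy as np

# Hypothetical initial rate and iteration budget
lr0, n_iters = 0.5, 1000

# Learning rate starts large and shrinks as training progresses
for t in (0, 250, 500, 1000):
    lr = lr0 * np.exp(-t / n_iters)
    print(t, round(lr, 3))   # 0.5, 0.389, 0.303, 0.184
```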
