Week 9 Flashcards

1
Q

Motivate ensemble methods

A

A single model may have high variance or high bias; combining several diverse models usually gives more accurate and more robust predictions than any individual model.
2
Q

Idea of ensemble methods

A

Train several base learners and combine their outputs (e.g. by voting or averaging) into a single prediction.
3
Q

What is Bagging

A

Bootstrap aggregating: train each base learner on a bootstrap sample (drawn with replacement) of the training set, then aggregate their predictions by voting or averaging.

4
Q

What is boosting?

A

Sequentially train weak learners, each one concentrating on the examples the previous learners got wrong, then combine them as a weighted vote.
5
Q

Bagging - process

A

Draw B bootstrap samples (sample n points with replacement) from the training set; train one model on each sample; predict by majority vote (classification) or averaging (regression).
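The bagging process can be sketched in a few lines of NumPy. This is an illustrative sketch, not from the lectures; `bagging_fit`, `bag_predict`, and the `make_model` factory are names of my own choosing.

```python
import numpy as np
from collections import Counter

def bagging_fit(make_model, X, y, n_models=10, rng=None):
    """Train each model on a bootstrap sample (n points drawn with replacement)."""
    rng = np.random.default_rng(rng)
    n = len(X)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)  # bootstrap: sample with replacement
        m = make_model()
        m.fit(X[idx], y[idx])
        models.append(m)
    return models

def bag_predict(models, X):
    """Aggregate by majority vote over the ensemble's predictions."""
    preds = np.array([m.predict(X) for m in models])  # (n_models, n_samples)
    return np.array([Counter(col).most_common(1)[0][0] for col in preds.T])
```

Any base learner with `fit`/`predict` methods slots in; in practice the base learners are usually decision trees.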
6
Q

Goal of boosting

A

Turn a collection of weak learners (each only slightly better than chance) into a single strong learner, primarily by reducing bias.
7
Q

Boosting procedure

A

Maintain a weight for each training example; after each round, increase the weights of misclassified examples so the next weak learner focuses on them; finally combine the learners with weights reflecting their accuracy.
8
Q

AdaBoost

A

Adaptive boosting: the canonical boosting algorithm. It fits weak learners (often decision stumps) in rounds, exponentially reweighting examples according to whether they were misclassified, and weighting each learner by its accuracy.
9
Q

AdaBoost advantages/disadvantages

A

Advantages: simple to implement, few hyperparameters, works with almost any weak learner, often very accurate.
Disadvantages: sensitive to noisy data and outliers (mislabelled points receive ever-larger weights); training is inherently sequential.
10
Q

AdaBoost algo

A

1. Initialise example weights w_i = 1/n.
2. For t = 1..T: fit weak learner h_t to the weighted data; compute its weighted error ε_t; set α_t = ½ ln((1 − ε_t)/ε_t); multiply each w_i by exp(−α_t y_i h_t(x_i)) and renormalise.
3. Output H(x) = sign(Σ_t α_t h_t(x)).
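As a concrete (unofficial) sketch of the AdaBoost steps, using axis-aligned decision stumps as the weak learner; labels are assumed to be in {−1, +1} and all function names are my own:

```python
import numpy as np

def best_stump(X, y, w):
    """Weak learner: pick (feature, threshold, sign) minimising weighted error."""
    best = None, None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (+1, -1):
                pred = np.where(X[:, j] <= t, s, -s)
                e = np.sum(w[pred != y])
                if e < best[2]:
                    best = ((j, t, s), pred, e)
    return best[0], best[1]

def adaboost_fit(X, y, n_rounds=20):
    """AdaBoost with decision stumps; y must be in {-1, +1}."""
    n = len(X)
    w = np.full(n, 1.0 / n)                     # start with uniform weights
    ensemble = []
    for _ in range(n_rounds):
        stump, pred = best_stump(X, y, w)
        err = np.sum(w[pred != y])
        err = min(max(err, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)   # learner weight
        w *= np.exp(-alpha * y * pred)          # up-weight mistakes
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(a * np.where(X[:, j] <= t, s, -s) for a, (j, t, s) in ensemble)
    return np.sign(score)
```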
11
Q

Stacked generalisation

A

Train several level-0 (base) learners; then train a level-1 meta-learner that takes the base learners' predictions as its inputs and learns how best to combine them (ideally trained on held-out predictions to avoid overfitting).
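A minimal stacking sketch, assuming the base and meta learners expose `fit`/`predict`; the function names and the explicit holdout-set argument are my own simplification:

```python
import numpy as np

def stack_fit(base_fits, meta_fit, X, y, X_holdout, y_holdout):
    """Level-0 learners train on (X, y); the level-1 meta-learner trains on
    their predictions for a held-out set, so it is not fitted on the base
    learners' training error."""
    bases = [fit(X, y) for fit in base_fits]
    Z = np.column_stack([b.predict(X_holdout) for b in bases])
    meta = meta_fit(Z, y_holdout)
    return bases, meta

def stack_predict(bases, meta, X):
    Z = np.column_stack([b.predict(X) for b in bases])
    return meta.predict(Z)
```

In practice the holdout predictions are usually produced by cross-validation rather than a single split.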
12
Q

Parallel structure for ensemble classifiers

A

All base classifiers receive the same input and run independently; a combiner (e.g. majority vote or averaging) merges their outputs. Bagging and random forests use this structure.
13
Q

Serial structure for ensemble classifiers

A

Classifiers are applied one after another: each classifier's output (or the reweighted data) is passed on to the next. Boosting is the standard example.
14
Q

Hierarchical structure for ensemble classifiers

A

Classifiers are arranged in a tree: the output of one classifier determines which classifier is consulted next, so different inputs can follow different paths through the ensemble.
15
Q

Random forest

A

A random forest is an ensemble of BAGGED decision trees with randomised feature selection: each tree is grown on a bootstrap sample, and each split considers only a random subset of the features.

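A toy sketch of both sources of randomness (bootstrap sampling plus random feature subsets), using depth-1 "trees" (stumps) for brevity; all names are my own, and a real implementation (e.g. scikit-learn's `RandomForestClassifier`) grows full trees:

```python
import numpy as np
from collections import Counter

def fit_stump(X, y, feats):
    """Best single split over the allowed features, by training error."""
    best = None
    for j in feats:
        for t in np.unique(X[:, j]):
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            if len(left) == 0 or len(right) == 0:
                continue
            cl = Counter(left).most_common(1)[0][0]   # majority label, left
            cr = Counter(right).most_common(1)[0][0]  # majority label, right
            err = np.sum(left != cl) + np.sum(right != cr)
            if best is None or err < best[0]:
                best = (err, j, t, cl, cr)
    _, j, t, cl, cr = best
    return j, t, cl, cr

def forest_fit(X, y, n_trees=25, rng=0):
    """Each tree: bootstrap sample + random subset of sqrt(d) features."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    k = max(1, int(np.sqrt(d)))
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, n, size=n)              # bagging
        feats = rng.choice(d, size=k, replace=False)  # feature randomisation
        trees.append(fit_stump(X[idx], y[idx], feats))
    return trees

def forest_predict(trees, X):
    votes = np.array([np.where(X[:, j] <= t, cl, cr) for j, t, cl, cr in trees])
    return np.array([Counter(col).most_common(1)[0][0] for col in votes.T])
```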
16
Q

Training decision tree

A

Grown greedily, top-down: at each node choose the split that best separates the classes (e.g. lowest Gini impurity or highest information gain), then recurse on the resulting subsets until the nodes are pure or a stopping criterion is met.

17
Q

Notes on decision trees

A

Decisions at nodes could be more complex than single-feature thresholds
Prediction performance can be poor (trees tend to overfit)
Unstable: small changes to the training set can cause large changes in classification accuracy (although bagging improves stability)

18
Q

Constructing binary decision tree algorithm

A

Start with all the training data at the root. If the node is pure (or a stopping rule fires), make it a leaf labelled with the majority class; otherwise evaluate every candidate feature/threshold pair, choose the binary split minimising the weighted impurity of the two children, partition the data accordingly, and recurse on each child.
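The greedy construction can be sketched as a short recursive implementation with Gini impurity (an illustrative sketch; the function names are my own):

```python
import numpy as np
from collections import Counter

def gini(y):
    """Gini impurity: 1 - sum_k p_k^2."""
    p = np.bincount(y) / len(y)
    return 1.0 - np.sum(p ** 2)

def grow(X, y, depth=0, max_depth=3):
    """Greedy top-down construction: at each node pick the (feature, threshold)
    split minimising the children's weighted impurity; recurse until pure or
    max depth. Leaves are labels; internal nodes are (j, t, left, right)."""
    if depth == max_depth or len(set(y)) == 1:
        return Counter(y).most_common(1)[0][0]        # leaf: majority label
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:             # keep both sides nonempty
            mask = X[:, j] <= t
            score = (mask.mean() * gini(y[mask]) +
                     (~mask).mean() * gini(y[~mask]))
            if best is None or score < best[0]:
                best = (score, j, t, mask)
    if best is None:                                  # no valid split remains
        return Counter(y).most_common(1)[0][0]
    _, j, t, mask = best
    return (j, t, grow(X[mask], y[mask], depth + 1, max_depth),
                  grow(X[~mask], y[~mask], depth + 1, max_depth))

def predict_one(node, x):
    while isinstance(node, tuple):
        j, t, left, right = node
        node = left if x[j] <= t else right
    return node
```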

19
Q

Randomised tree learning procedures algo

A

As for an ordinary decision tree, but with randomness injected: train each tree on a bootstrap sample of the data and, at every node, search for the best split only within a small random subset of the features (e.g. √d of the d features).

20
Q

Advantages of random forests

A

Fast (scalable)
Accurate
Simple to implement
Popular