6+7 Flashcards

1
Q

What is regression? How does this relate to classification?

A

Regression involves modelling the relationship between input variables and continuous output variables. Regression is more about approximating functions, whereas classification is about decision making.

2
Q

What is residual error?

A

Residual error is the deviation of the output given by the hypothesis from the “true” value.
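The definition above can be sketched in a few lines; the hypothesis h, the weight, and the data points are illustrative assumptions, not from the cards.

```python
def h(x, w=2.0):
    """Assumed toy hypothesis: a simple linear model h(x) = w * x."""
    return w * x

xs = [1.0, 2.0, 3.0]
ys = [2.5, 3.5, 6.5]   # "true" outputs (made-up data)

# Residual error: deviation of the true value from the hypothesis output.
residuals = [y - h(x) for x, y in zip(xs, ys)]
print(residuals)  # one deviation per data point
```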

3
Q

What is mean squared error?

A

The average value of the squared residual error; this gives a measure of the deviation of the hypothesis output from the “true” output.
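A minimal sketch of computing MSE, assuming a one-dimensional dataset and a simple hypothesis function (both made up for illustration):

```python
def mse(h, xs, ys):
    """Mean squared error: the average of the squared residuals,
    measuring how far the hypothesis output deviates from the true output."""
    return sum((y - h(x)) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Illustrative hypothesis and data (assumptions, not from the cards).
h = lambda x: 2.0 * x
xs = [1.0, 2.0, 3.0]
ys = [2.5, 3.5, 6.5]
print(mse(h, xs, ys))  # (0.25 + 0.25 + 0.25) / 3 = 0.25
```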

4
Q

What is a cost function? What about a cost surface?

A

A cost function measures the cost (e.g. MSE) of a hypothesis; the aim of training is to minimise it. A cost surface is the surface the cost function describes in three dimensions: for each possible pair of weights w1 and w2 there is a cost value J.

5
Q

What is parameter search?

A

Finding the minimum point of the cost surface using optimisation techniques. It involves making an initial guess of the parameter values and computing the cost, then updating the parameters and computing the new cost, repeating the update-and-compute step.
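The guess, update, and recompute loop can be sketched as below; the candidate-step update strategy and the toy cost function are assumptions for illustration, not the cards' prescribed method.

```python
def parameter_search(cost, w0, step=0.1, iters=100):
    """Parameter search sketch: make an initial guess, compute its cost,
    then repeatedly try updated parameter values and recompute the cost,
    keeping whichever value is cheaper."""
    w, c = w0, cost(w0)
    for _ in range(iters):
        for cand in (w - step, w + step):  # candidate parameter updates
            c_cand = cost(cand)
            if c_cand < c:                 # keep the cheaper parameters
                w, c = cand, c_cand
    return w, c

# Assumed toy cost: squared distance from 3, minimised at w = 3.
w, c = parameter_search(lambda w: (w - 3.0) ** 2, w0=0.0)
print(w, c)
```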

6
Q

What is random walk?

A

Pick the weight update at random and keep the new parameters if the cost is reduced; otherwise discard them and pick another random change set.
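A sketch of random-walk search; the step scale, step count, seed, and two-weight toy cost are assumptions for illustration.

```python
import random

def random_walk(cost, w0, scale=0.5, steps=2000, seed=0):
    """Random walk: propose a random change set, keep the new parameters
    only if the cost is reduced, otherwise discard and try again."""
    rng = random.Random(seed)
    w, c = list(w0), cost(w0)
    for _ in range(steps):
        cand = [wi + rng.uniform(-scale, scale) for wi in w]  # random change set
        c_cand = cost(cand)
        if c_cand < c:          # keep only cost-reducing changes
            w, c = cand, c_cand
    return w, c

# Assumed toy cost surface with its minimum at w = (1, -2).
cost = lambda w: (w[0] - 1.0) ** 2 + (w[1] + 2.0) ** 2
w, c = random_walk(cost, [0.0, 0.0])
print(w, c)
```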

7
Q

What is steepest gradient descent?

A

The weight update corresponds to the negative gradient of the cost function, which gives the direction that should be taken. The gradient tells how steep the surface is at a point, but not how far to travel.
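A sketch of steepest gradient descent on the MSE of a linear hypothesis h(x) = w1*x + w0; the learning rate, epoch count, and toy data are assumptions. Note how the learning rate lr supplies the "how far to travel" that the gradient alone does not.

```python
def gradient_descent(xs, ys, lr=0.05, epochs=500):
    """Steepest gradient descent: each weight update is the negative
    gradient of the MSE cost, scaled by a chosen step size lr."""
    w0, w1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradient of J = (1/n) * sum (h(x) - y)^2 w.r.t. w0 and w1.
        g0 = (2.0 / n) * sum((w1 * x + w0 - y) for x, y in zip(xs, ys))
        g1 = (2.0 / n) * sum((w1 * x + w0 - y) * x for x, y in zip(xs, ys))
        w0 -= lr * g0   # step in the direction of the negative gradient
        w1 -= lr * g1
    return w0, w1

# Assumed toy data generated from y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
w0, w1 = gradient_descent(xs, ys)
print(w0, w1)  # should approach 1 and 2
```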

8
Q

What is a polynomial hypothesis?

A

One in which the inputs can be raised to powers (e.g. x, x², x³). The derivatives of such a hypothesis are easy to take, for use in steepest gradient descent.
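A sketch of a polynomial hypothesis and its easily-taken weight derivatives (the partial derivative of h with respect to wi is simply x**i); the coefficient values are illustrative assumptions.

```python
def poly_h(w, x):
    """Polynomial hypothesis: h(x) = w0 + w1*x + w2*x^2 + ..."""
    return sum(wi * x ** i for i, wi in enumerate(w))

def poly_h_grad(w, x):
    """Partial derivative of h w.r.t. each weight wi is x**i, which is
    what makes the gradient easy to compute for steepest gradient descent."""
    return [x ** i for i in range(len(w))]

# h(x) = 1 + 2x + 3x^2 at x = 2  ->  1 + 4 + 12 = 17
print(poly_h([1.0, 2.0, 3.0], 2.0))       # 17.0
print(poly_h_grad([1.0, 2.0, 3.0], 2.0))  # [1.0, 2.0, 4.0]
```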

9
Q

What is a linear hypothesis?

A

One in which nothing is raised to a power other than 1. Its cost surface will be convex, so steepest gradient descent is guaranteed to find the optimum solution.

10
Q

What is a nonlinear hypothesis?

A

One which is not linear; it will have a non-convex cost surface. Steepest gradient descent is not guaranteed to find the optimum solution, as there may be many local minima.
