Week 1 Flashcards

1
Q

What is gradient descent?

A

An iterative algorithm for minimizing functions, for example the cost function of linear regression.
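
A minimal sketch of the idea in Python (the function, starting point, and learning rate here are illustrative choices, not from the card):

    # Minimize f(theta) = theta**2 by repeatedly stepping against the derivative.
    def gradient_descent(deriv, theta=5.0, alpha=0.1, steps=100):
        for _ in range(steps):
            theta = theta - alpha * deriv(theta)  # theta := theta - alpha * f'(theta)
        return theta

    print(gradient_descent(lambda t: 2 * t))  # f'(theta) = 2*theta; result approaches 0.0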

2
Q

What does a := b mean?

A

Set a to the value of b
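
In Python terms (a tiny illustrative snippet, not from the card), := corresponds to plain assignment, while a bare mathematical = asserts that two things are equal:

    b = 3
    a = b          # a := b  ->  assignment: a now holds the value of b
    print(a == b)  # True; '==' tests equality rather than assigning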

3
Q

What does α (alpha) represent in the gradient descent algorithm?

A

The learning rate: it controls how big a step gradient descent takes when updating the parameters θ_j.
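
For reference, the update rule in which α appears (standard gradient descent notation; the rule itself is not spelled out on the card):

    θ_j := θ_j − α · ∂/∂θ_j J(θ_0, θ_1)    (for j = 0 and j = 1, updated simultaneously)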

4
Q

In the gradient descent algorithm, what does the derivative term signify?

A

The slope of the tangent to the cost function at the current point; it can be positive or negative.
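
One step of reasoning the card leaves implicit: in the update θ_j := θ_j − α · dJ/dθ_j, a positive slope subtracts a positive quantity (θ_j decreases) and a negative slope subtracts a negative one (θ_j increases), so either sign moves θ_j toward the minimum.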

5
Q

What is a convex function?

A

A bowl-shaped function with no local optima, only a single global optimum. The cost function for linear regression is always convex.
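
The formal definition behind "bowl shaped" (standard usage, not from the card): f is convex if its graph never rises above the chord between any two of its points, i.e.

    f(λx + (1 − λ)y) ≤ λ · f(x) + (1 − λ) · f(y)    for all x, y and all λ in [0, 1]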

6
Q

What is batch gradient descent?

A

Each step of gradient descent uses all the training examples (other variants of gradient descent use only a subset of the training examples per step).
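
A sketch of one batch step for univariate linear regression (the toy data and learning rate are assumptions for illustration):

    # One gradient-descent step that sums over ALL m training examples ("batch").
    def batch_step(theta0, theta1, xs, ys, alpha=0.1):
        m = len(xs)
        # Partial derivatives of J(theta0, theta1), each averaged over every example.
        grad0 = sum((theta0 + theta1 * x) - y for x, y in zip(xs, ys)) / m
        grad1 = sum(((theta0 + theta1 * x) - y) * x for x, y in zip(xs, ys)) / m
        # Simultaneous update: both gradients are computed from the old theta values.
        return theta0 - alpha * grad0, theta1 - alpha * grad1

    xs, ys = [1, 2, 3], [2, 4, 6]  # toy data lying exactly on y = 2x
    theta0, theta1 = 0.0, 0.0
    for _ in range(1000):
        theta0, theta1 = batch_step(theta0, theta1, xs, ys)
    print(theta0, theta1)          # converges to about (0.0, 2.0)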

7
Q

What is the notation for the linear regression hypothesis?

A

h_θ(x) = θ_0 + θ_1 x
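
A quick worked check with made-up parameter values: if θ_0 = 1 and θ_1 = 2, then h_θ(3) = 1 + 2 · 3 = 7.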

8
Q

What is univariate linear regression?

A

Linear regression with one variable

9
Q

What is the notation for the cost function?

A

J(θ_0, θ_1) = (1/2m) Σ_{i=1}^{m} (h_θ(x^(i)) − y^(i))²
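
A minimal sketch of evaluating J directly from the formula (toy data assumed, matching the batch example above):

    # J(theta0, theta1) = (1 / 2m) * sum of squared prediction errors.
    def cost(theta0, theta1, xs, ys):
        m = len(xs)
        return sum(((theta0 + theta1 * x) - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)

    print(cost(0.0, 2.0, [1, 2, 3], [2, 4, 6]))  # 0.0 -- a perfect fit has zero cost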

10
Q

What is another name for the cost function?

A

Squared error function
