Week 1 Flashcards
What is gradient descent?
An algorithm for minimizing functions, for example the cost function in linear regression
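As a minimal Python sketch (the function f(θ) = θ², the starting value, and the step count are all arbitrary illustrations, not from the course):

```python
# Minimal gradient descent on f(theta) = theta^2, whose derivative is 2*theta.
# All values below are arbitrary choices for illustration.
theta = 10.0   # arbitrary starting guess
alpha = 0.1    # learning rate
for _ in range(100):
    gradient = 2 * theta              # f'(theta) for f(theta) = theta**2
    theta = theta - alpha * gradient  # step downhill, opposite the slope
print(theta)  # approaches 0, the minimizer of f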
What does a:=b mean?
Set a to the value of b
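In code this corresponds to ordinary assignment, as opposed to asserting equality; a short Python illustration (variable values are arbitrary):

```python
b = 3
a = b          # assignment: a := b, so a is now 3
print(a == b)  # equality test (a truth assertion), prints True
```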
What does alpha (α) represent in the gradient descent algorithm?
The learning rate: it controls how big a step gradient descent takes when updating the parameters θⱼ
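A quick Python sketch of how α scales the step in θ := θ − α·(slope); all values are arbitrary assumptions:

```python
# Effect of the learning rate alpha on a single update theta := theta - alpha * slope.
# The starting theta, slope, and alpha values are arbitrary illustrations.
theta, slope = 5.0, 2.0
for alpha in (0.01, 0.1, 1.0):
    step = alpha * slope
    print(f"alpha={alpha}: theta moves from {theta} to {theta - step}")
# A small alpha takes tiny steps (slow convergence);
# too large an alpha can overshoot the minimum.
```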
In the gradient descent algorithm, what does the derivative signify?
The slope of the tangent line at the current point; it can be positive or negative, and its sign determines the direction of the parameter update
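A small Python illustration (arbitrary numbers): with the update θ := θ − α·(slope), a positive slope pushes θ down and a negative slope pushes θ up, so either way θ moves toward the minimum:

```python
# The sign of the derivative decides the direction of the update.
alpha = 0.5
for slope in (+3.0, -3.0):
    theta = 2.0
    theta_new = theta - alpha * slope
    direction = "decreases" if theta_new < theta else "increases"
    print(f"slope={slope:+}: theta {direction} to {theta_new}")
```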
What is a convex function?
A bowl-shaped function with no local optima, only a single global optimum. The cost function for linear regression is always convex
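Formally, using the standard definition (not from the cards), a function f is convex when every chord lies on or above its graph:

```latex
f(\lambda a + (1-\lambda) b) \le \lambda f(a) + (1-\lambda) f(b)
\quad \text{for all } a, b \text{ and } \lambda \in [0, 1]
```

Because J(θ₀, θ₁) in linear regression satisfies this, gradient descent cannot get stuck in a local optimum.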
What is batch gradient descent?
Each step of gradient descent uses all the training examples (other variants of gradient descent use only a subset of the training examples)
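A sketch of one batch update for univariate linear regression, assuming NumPy and made-up toy data; note that every training example contributes to each gradient:

```python
import numpy as np

# One batch gradient descent step for linear regression (toy data, arbitrary values).
x = np.array([1.0, 2.0, 3.0])  # m = 3 training examples
y = np.array([2.0, 4.0, 6.0])
theta0, theta1, alpha = 0.0, 0.0, 0.1
m = len(x)

h = theta0 + theta1 * x  # predictions for ALL m examples at once
# Simultaneous update, with gradients averaged over the whole batch:
grad0 = (1 / m) * np.sum(h - y)
grad1 = (1 / m) * np.sum((h - y) * x)
theta0, theta1 = theta0 - alpha * grad0, theta1 - alpha * grad1
print(theta0, theta1)
```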
What is the notation for the linear regression hypothesis?
h_θ(x) = θ₀ + θ₁x
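As a one-line Python sketch (the parameter values are arbitrary assumptions):

```python
# Hypothesis for univariate linear regression: h_theta(x) = theta0 + theta1 * x
def h(x, theta0=1.0, theta1=2.0):  # example parameter values, chosen arbitrarily
    return theta0 + theta1 * x

print(h(3.0))  # 1.0 + 2.0 * 3.0 = 7.0
```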
What is univariate linear regression?
Linear regression with one variable
What is the notation for the cost function?
J(θ₀, θ₁) = (1/2m) Σᵢ₌₁ᵐ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)²
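A minimal NumPy sketch of J, using made-up toy data:

```python
import numpy as np

# Squared error cost: J(theta0, theta1) = (1 / (2m)) * sum((h(x_i) - y_i)^2)
def cost(theta0, theta1, x, y):
    m = len(x)
    h = theta0 + theta1 * x  # predictions for all m examples
    return (1 / (2 * m)) * np.sum((h - y) ** 2)

x = np.array([1.0, 2.0, 3.0])  # toy data (assumed)
y = np.array([2.0, 4.0, 6.0])
print(cost(0.0, 2.0, x, y))    # a perfect fit gives cost 0.0
```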
What is another name for the cost function?
Squared error function