Lecture 13 & Workshop 3: Logistic Regression with Regularisation Flashcards

1
Q

The model is built around the notion underlying y = 1.
e.g. y = 1 means the tumor is malignant,
y = 0 means the tumor is benign.
=> the model is trying to predict whether the tumor is ?.

A

The model is built around the notion underlying y = 1.
e.g. y = 1 means the tumor is malignant,
y = 0 means the tumor is benign.
=> the model is trying to predict whether the tumor is malignant.

2
Q

Logistic regression is the classification counterpart of linear regression, where y ∈ {??}

A

Logistic regression is the classification counterpart of linear regression, where y ∈ {0, 1}

3
Q

Model for logistic regression:
h(x) = g(??) using the sigmoid (logistic) function g

h(x) can be interpreted as the estimated ? that ?? given input x.

A

Model for logistic regression:
h(x) = g(theta^T * x) using the sigmoid (logistic) function g

h(x) can be interpreted as the estimated probability that y = 1 given input x.
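
A minimal NumPy sketch of this hypothesis (the variable names and example values are illustrative, not from the lecture):

import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def h(theta, x):
    # Estimated probability that y = 1 given input x
    return sigmoid(theta @ x)

theta = np.array([-1.0, 0.5])   # illustrative parameters
x = np.array([1.0, 4.0])        # x[0] = 1 is the intercept term
print(h(theta, x))              # ~0.73, i.e. P(y = 1 | x; theta) ~ 0.73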

4
Q

Sigmoid (logistic) function:

g(theta^T * x) = 1 / [1 + ??]

A

Sigmoid (logistic) function:

g(theta^T * x) = 1 / [1 + e^(-theta^T * x)]
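
A quick sketch of the function's shape, assuming NumPy (the sample inputs are illustrative):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# g(0) = 0.5; g(z) -> 1 as z -> +inf and g(z) -> 0 as z -> -inf,
# so the output always lies in (0, 1) and can be read as a probability
print(sigmoid(0.0))                       # 0.5
print(sigmoid(np.array([-10.0, 10.0])))   # ~[0.0000454, 0.9999546]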

5
Q

To have a ? cost / best-fitted h(x):
if y = 1 => we want h(x) to be close to ?
if y = 0 => we want h(x) to be close to ?.

A

To have a low cost / best-fitted h(x):
if y = 1 => we want h(x) to be close to 1
if y = 0 => we want h(x) to be close to 0.

6
Q

Logistic Cost function:
when y = 1:
J(theta) = (-1/?) * Sum [ ??? ]

A

Logistic Cost function:
when y = 1:
J(theta) = (-1/m) * Sum [ y * log( h(x) ) ] = (-1/m) * Sum [ log( h(x) ) ] since y = 1
m: number of observations
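
A sketch of this case in NumPy (h_vals stands for the model's predictions h(x) on examples where y = 1; the name and values are illustrative):

import numpy as np

def cost_y1(h_vals):
    # J = (-1/m) * sum(log(h(x))) over examples with y = 1
    m = len(h_vals)
    return (-1.0 / m) * np.sum(np.log(h_vals))

print(cost_y1(np.array([0.9, 0.99])))   # ~0.058: predictions near 1 => low cost
print(cost_y1(np.array([0.1, 0.01])))   # ~3.45: predictions near 0 => high cost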

7
Q

Logistic Cost function when y = 0:

J(theta) = (-1/m) * Sum [ ??? ]

A

Logistic Cost function when y = 0:
J(theta) = (-1/m) * Sum [ log( 1 - h(x) ) ]
m: number of observations
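
Combining both cases gives the full cross-entropy cost. A sketch assuming NumPy, with X a design matrix and y a 0/1 label vector (all names and values are illustrative):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # J = (-1/m) * Sum [ y*log(h(x)) + (1 - y)*log(1 - h(x)) ]
    # covers both the y = 1 and y = 0 cases in one expression
    m = len(y)
    h = sigmoid(X @ theta)
    return (-1.0 / m) * np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))

X = np.array([[1.0, 2.0],     # each row: [intercept term, feature]
              [1.0, -1.0]])
y = np.array([1.0, 0.0])
theta = np.array([0.0, 1.0])
print(cost(theta, X, y))      # ~0.22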
