Linear Probability Model Flashcards

(6 cards)

1
Q

What makes the Linear Probability Model different from a standard regression model?

A
  • The dependent variable is a dummy (binary) variable taking the values 0 and 1
  • The fitted value from the regression is therefore interpreted as the probability that y = 1 (see the sketch below)
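A minimal sketch of fitting an LPM by running OLS on a dummy dependent variable. The data are simulated and the variable names are illustrative; it assumes the statsmodels library is available.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    x = rng.normal(size=500)                  # illustrative regressor
    p = 0.5 + 0.1 * x                         # assumed true probability that y = 1
    y = rng.binomial(1, np.clip(p, 0, 1))     # binary (dummy) dependent variable

    X = sm.add_constant(x)                    # adds the intercept column
    lpm = sm.OLS(y, X).fit()                  # the LPM is just OLS on the dummy outcome
    print(lpm.params)                         # b0 and b1
    print(lpm.fittedvalues[:5])               # fitted values, read as probabilities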
2
Q

What is the random variable that takes only the values 0 and 1?

A
  • The Bernoulli random variable
  • Where y = 1 with probability p and y = 0 with probability (1-p)
3
Q

What are the expected value and variance of the Bernoulli random variable?

A
  • E(y) = p
  • Var(y) = p(1-p)
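A short derivation of both results from the definition of the Bernoulli random variable (a standard derivation, not part of the original card):

    \begin{aligned}
    E(y)   &= 1 \cdot p + 0 \cdot (1 - p) = p \\
    E(y^2) &= 1^2 \cdot p + 0^2 \cdot (1 - p) = p \\
    \operatorname{Var}(y) &= E(y^2) - \bigl(E(y)\bigr)^2 = p - p^2 = p(1 - p)
    \end{aligned}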
4
Q

How are b0 and b1 interpreted in the Linear Probability Model?

A
  • b0 is the predicted probability that y = 1 when x = 0
  • b1 is the change in the probability that y = 1 for each one-unit increase in x
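A small worked example of this interpretation; the coefficient values below are hypothetical, chosen only for illustration.

    # Hypothetical LPM estimates: b0 = 0.20, b1 = 0.05
    b0, b1 = 0.20, 0.05

    def predicted_prob(x):
        # Predicted probability that y = 1 at a given x
        return b0 + b1 * x

    print(predicted_prob(0))   # 0.20 -> the probability when x = 0 (the intercept b0)
    print(predicted_prob(1))   # 0.25 -> a one-unit increase in x raises it by b1 = 0.05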
5
Q

What is the main criticism of the LPM?

A
  • The fitted values from an OLS regression are not guaranteed to lie between zero and one
  • This means that predicted probabilities can end up greater than 1 or less than 0, as the sketch below illustrates
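A sketch showing how LPM predictions can fall outside [0, 1]. The data are simulated for illustration and the code assumes statsmodels.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    x = rng.uniform(0, 10, size=200)
    y = rng.binomial(1, np.clip(0.05 + 0.1 * x, 0, 1))   # probability grows with x

    lpm = sm.OLS(y, sm.add_constant(x)).fit()

    # Predict at an x beyond the range where the linear fit stays inside [0, 1]
    x_new = sm.add_constant(np.array([0.0, 12.0]), has_constant="add")
    print(lpm.predict(x_new))   # the second prediction will typically exceed 1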
6
Q

What is another criticism of the LPM?

A
  • The variance of a Bernoulli variable is p(1-p), and in the LPM p depends on x, so the error term is heteroskedastic (see the sketch below)
  • Standard OLS inference assumes homoskedastic errors
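A sketch of why the error is heteroskedastic: the conditional variance p(1-p) changes with x because p itself depends on x. The coefficients below are hypothetical, used only to show the pattern.

    # Hypothetical LPM: P(y = 1 | x) = 0.1 + 0.08 * x
    b0, b1 = 0.1, 0.08

    for x in [0, 2, 4, 6, 8]:
        p = b0 + b1 * x
        var = p * (1 - p)          # Var(y | x) = p(1 - p), a Bernoulli variance
        print(f"x = {x}:  p = {p:.2f},  Var(y | x) = {var:.4f}")

    # The variance differs across x, which violates the homoskedasticity
    # assumption behind the usual OLS standard errors.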