Linear Probability Model Flashcards
(6 cards)
1
Q
What makes the Linear Probability Model different from a standard linear regression model?
A
- The dependent variable is a dummy (0/1) variable
- As a result, the fitted value is interpreted as a probability: E(y | x) = P(y = 1 | x) (see the sketch below)
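A minimal sketch (hypothetical data, numpy only, assumed p = 0.7) of why a 0/1 dependent variable makes the fitted value a probability: the mean of a dummy variable is just the share of ones, i.e. an estimate of P(y = 1).

import numpy as np

rng = np.random.default_rng(0)
y = rng.binomial(1, 0.7, size=1_000)   # hypothetical 0/1 outcome, e.g. employed or not
print(y.mean())                        # the mean of a dummy is the share of ones, ~0.7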
2
Q
What is the random variable that takes only the values 0 and 1?
A
- The Bernoulli random variable
- y = 1 with probability p and y = 0 with probability (1 - p)
3
Q
What are the expected value and variance of a Bernoulli random variable?
A
- E(y) = p
- Var(y) = p(1-p)
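These follow from the definition: E(y) = 1·p + 0·(1 - p) = p, and Var(y) = E(y²) - [E(y)]² = p - p² = p(1 - p). A quick numerical check (a sketch assuming p = 0.3):

import numpy as np

rng = np.random.default_rng(1)
p = 0.3
y = rng.binomial(1, p, size=100_000)
print(y.mean())   # close to p = 0.3
print(y.var())    # close to p * (1 - p) = 0.21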
4
Q
How are b0 and b1 interpreted in the Linear Probability Model?
A
- b0 is the predicted probability that y = 1 when x = 0
- b1 is the change in the probability that y = 1 for each one-unit increase in x (see the sketch below)
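A sketch fitting the LPM by OLS with statsmodels on simulated data (the true intercept 0.2 and slope 0.05 are assumptions) and reading the coefficients as probabilities:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=5_000)   # hypothetical regressor, e.g. years of education
p = 0.2 + 0.05 * x                   # true P(y = 1 | x), stays inside [0, 1] here
y = rng.binomial(1, p)               # binary outcome

X = sm.add_constant(x)
res = sm.OLS(y, X).fit()

b0, b1 = res.params
print(b0)   # ~0.2: estimated P(y = 1) when x = 0
print(b1)   # ~0.05: change in P(y = 1) for a one-unit increase in x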
5
Q
What is the main criticism of the LPM?
A
- The fitted values from an OLS regression are not guaranteed to lie between zero and one
- So the predicted "probabilities" can end up greater than 1 or less than 0 (see the sketch below)
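A sketch of the problem using assumed LPM estimates (b0 = 0.2, b1 = 0.05): the fitted line keeps rising with x, so extreme values of x push the predicted "probability" outside [0, 1].

# Hypothetical LPM estimates
b0, b1 = 0.2, 0.05

for x in (-10.0, 5.0, 25.0):
    p_hat = b0 + b1 * x
    print(x, p_hat)   # x = -10 gives -0.30 and x = 25 gives 1.45, both outside [0, 1]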
6
Q
What is another criticism of the LPM?
A
- The LPM error term is heteroskedastic: Var(y | x) = p(x)(1 - p(x)) changes with x
- Classical OLS inference assumes homoskedasticity, so the usual standard errors are invalid; heteroskedasticity-robust standard errors are the standard fix (see the sketch below)
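Because Var(y | x) = p(x)(1 - p(x)) depends on x, the usual OLS standard errors are unreliable. A common remedy, shown in this sketch with simulated data, is heteroskedasticity-robust (White/HC) standard errors:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, size=5_000)
y = rng.binomial(1, 0.2 + 0.05 * x)   # LPM data: Var(y | x) depends on x

X = sm.add_constant(x)
usual = sm.OLS(y, X).fit()                  # classical standard errors
robust = sm.OLS(y, X).fit(cov_type="HC1")   # heteroskedasticity-robust standard errors

print(usual.bse)    # standard errors assuming homoskedasticity
print(robust.bse)   # robust standard errors, valid under heteroskedasticity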