Lecture 7: Markov Chain Monte Carlo Flashcards

1
Q

Why do we often not estimate the 'marginal likelihood' / marginal probability of the data p(X) when estimating posterior distributions?

A

Because p(X) is only a normalizing constant: the shape of the posterior distribution is already given by the product of the prior p(θ) and the likelihood L(θ|X) ≡ p(X|θ), i.e. p(θ|X) ∝ p(θ)L(θ|X). Sampling methods only need this unnormalized form.
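In symbols (standard Bayes' theorem, not specific to these slides; the proportionality is why p(X) can be ignored):

```latex
p(\theta \mid X) \;=\; \frac{p(\theta)\,L(\theta \mid X)}{p(X)}
\;\propto\; p(\theta)\,L(\theta \mid X),
\qquad
p(X) \;=\; \int p(\theta)\,L(\theta \mid X)\,d\theta .
```

The denominator p(X) does not depend on θ, so it rescales the posterior without changing its shape.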

2
Q

What does the term ‘Monte Carlo’ refer to?

A

The term gets thrown around loosely and sometimes just means any simulation, but it usually refers to algorithms that use random simulation to solve a numerical problem with one fixed answer (e.g. an integral or an area)

3
Q

Describe the analogy of a strange dartboard with regard to Monte Carlo integration

A
  • Say we want to know the area of some strange dartboard
  • We throw darts uniformly at random over a square that encloses the dartboard
  • Since we know the range of the uniform distribution (the square around the dartboard):
  • We take the proportion of darts that land inside the dartboard (an estimate of the probability of a hit)
  • Then multiply that estimated probability by the area of the square
  • This gives an approximate area of the strange dartboard
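The steps above can be sketched in a few lines of Python. This is a minimal illustration (not from the lecture), using a round dartboard of radius 1 inside a 2×2 square, so the true area is π:

```python
import random

def estimate_dartboard_area(n_darts=100_000, seed=0):
    """Estimate the area of a circular 'dartboard' of radius 1
    by throwing uniform darts at the enclosing 2x2 square."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_darts):
        # Throw a dart uniformly over the square [-1, 1] x [-1, 1]
        x = rng.uniform(-1.0, 1.0)
        y = rng.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:  # dart landed inside the board
            hits += 1
    square_area = 4.0             # area of the enclosing square
    # proportion of hits (estimated probability) times square area
    return (hits / n_darts) * square_area
```

With enough darts the estimate converges to the true area (π ≈ 3.1416 here), with error shrinking like 1/√n.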
4
Q

What are Monte Carlo methods used to obtain (for this course)?

A

Posterior samples

5
Q

Name a type of Monte Carlo algorithm designed to obtain posterior samples in this manner

A

Rejection sampling

6
Q

Describe how rejection sampling works using the metaphor of the dartboard

A
  • Say our strange dartboard is the scaled shape of a posterior distribution f(θ) = p(θ)L(θ|X)
  • We throw darts at this strange dartboard uniformly in two dimensions
  • We use the x-coordinates of the non-rejected (accepted) darts as samples from our posterior distribution in one dimension

Thus we can scale a proposal distribution with PDF g(θ) (e.g. a Uniform) by a scalar M chosen so that M·g(θ) ≥ f(θ) everywhere, obtain a sample θs from the proposal, and accept it with probability:
f(θs) / (M·g(θs))

This algorithm will then draw samples from the posterior distribution
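A minimal sketch of this procedure, using a hypothetical one-dimensional unnormalized posterior f(θ) ∝ θ²(1−θ)³ on [0, 1] (a Beta(3, 4) shape; the example density and function names are illustrative, not from the lecture):

```python
import random

def f(theta):
    """Unnormalized posterior: prior x likelihood.
    Hypothetical example with a Beta(3, 4) shape on [0, 1]."""
    return theta ** 2 * (1.0 - theta) ** 3

# Envelope constant: must satisfy M * g(theta) >= f(theta) everywhere.
# Here g is Uniform(0, 1) with g(theta) = 1, and max f ~= 0.0346 at theta = 0.4.
M = 0.035

def rejection_sample(n_samples, seed=0):
    """Draw n_samples from the posterior via rejection sampling."""
    rng = random.Random(seed)
    samples = []
    while len(samples) < n_samples:
        theta_s = rng.uniform(0.0, 1.0)   # dart's x-coordinate: a draw from g
        u = rng.uniform(0.0, 1.0)         # dart's vertical position (scaled)
        if u < f(theta_s) / M:            # accept with probability f/(M*g)
            samples.append(theta_s)       # accepted darts are posterior samples
    return samples
```

The accepted θ values are distributed according to the normalized posterior; note that a tighter M wastes fewer darts, since the acceptance rate equals (area under f) / M.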
