Exam Notes Flashcards

(71 cards)

1
Q

Exchangeability

A
2
Q

Prior predictive density

A
3
Q

Posterior predictive density

A
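One standard form, assuming the observations are conditionally i.i.d. given \theta (the \tilde{x} notation for a new observation is added here for illustration):

p(\tilde{x} \mid x_{1:n}) = \int p(\tilde{x} \mid \theta)\, \pi(\theta \mid x_{1:n})\, d\theta

i.e. the sampling density averaged over the posterior; the prior predictive density replaces \pi(\theta \mid x_{1:n}) with the prior \pi(\theta).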
4
Q

De Finetti representation theorem

A
5
Q

Regular exponential family

A
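One common way to write the density (natural parameterisation; the symbols h, t, \psi are standard notation added here, and "regular" is usually taken to mean the natural parameter space is open):

p(x \mid \eta) = h(x) \exp\{ \eta^\top t(x) - \psi(\eta) \}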
6
Q

Conjugate prior for a regular exponential family

A
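A standard conjugate family for the natural parameter \eta of the exponential family above, with hyperparameters n_0 and \tau (hyperparameter names are illustrative):

\pi(\eta \mid n_0, \tau) \propto \exp\{ \eta^\top \tau - n_0\, \psi(\eta) \}

After observing x_{1:n}, the posterior is of the same form with hyperparameters n_0 + n and \tau + \sum_{i=1}^{n} t(x_i).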
7
Q

Frequentist inference models (distributions)

A
8
Q

Bayesian inference for normal models with known variance

A
9
Q

Bayesian inference for the normal model with both mean and variance unknown

A
10
Q

Likelihood principle

A
11
Q

Define uninformative priors

A
12
Q

Improper prior

A
13
Q

Uninformative priors are not

A

Invariant under reparameterisation

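A minimal sketch of why flatness is not preserved (the exponential reparameterisation is an illustrative choice): if \pi(\theta) \propto 1 and \phi = e^{\theta}, the change-of-variables formula gives

\pi(\phi) = \pi(\log \phi) \left| \tfrac{d\theta}{d\phi} \right| \propto \tfrac{1}{\phi}

which is no longer flat, so a prior that is "uninformative" in one parameterisation is informative in another.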
14
Q

Jeffreys prior is ___ and is not ___

A

Invariant under reparameterisation

Does not uphold the likelihood principle

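For reference, Jeffreys' prior in the one-parameter case and the usual invariance argument:

\pi_J(\theta) \propto \sqrt{I(\theta)}, \qquad I(\theta) = -\mathbb{E}_\theta\!\left[ \frac{\partial^2 \log p(x \mid \theta)}{\partial \theta^2} \right]

For \phi = g(\theta), I(\phi) = I(\theta)(d\theta/d\phi)^2, so \pi_J(\phi) \propto \sqrt{I(\phi)} matches transforming \pi_J(\theta) by the Jacobian. Because I(\theta) is an expectation over data that were not observed, the prior depends on the experiment and not only on the observed likelihood, which is why it does not uphold the likelihood principle.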
15
Q

Define Bayesian point estimator

A
16
Q

Show that quadratic loss gives mean

A
17
Q

Show that absolute loss gives median

A
18
Q

Show that 0-1 loss gives the mode, and when do we use it?

A

Use for discrete parameters; the mode is the maximum a posteriori (MAP) estimate

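Sketch for a discrete parameter: with L(\theta, a) = \mathbb{1}\{a \neq \theta\}, the posterior expected loss is

\mathbb{E}[L(\theta, a) \mid x] = 1 - P(\theta = a \mid x)

which is minimised by taking a to be the posterior mode, i.e. the MAP estimate.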
19
Q

Define frequentist CI

A
20
Q

Define Bayesian CI

A
21
Q

Define quantile intervals

A
22
Q

Define HPD

A

Highest posterior density

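One standard definition, with 1 - \alpha the credibility level:

C_\alpha = \{ \theta : \pi(\theta \mid x) \ge k_\alpha \}, \quad k_\alpha \text{ chosen so that } P(\theta \in C_\alpha \mid x) = 1 - \alpha

Among all 1 - \alpha credible sets the HPD set has the smallest volume; for a unimodal posterior it is an interval.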
23
Q

Monte Carlo integration for the posterior mean, the expectation of a function of the parameters, and posterior probabilities

A
24
Q

Define posterior consistency

A
25
Doob’s theorem
26
Criticism of Doob's theorem
27
TV distance of priors
28
MLE for Bayesian
29
Posterior asymptotic normality
30
Form g(theta) for Laplace approx
31
Use g to produce Laplace approximation
32
Logit of pi & inverse
33
Log likelihood for logit
34
Score function for logit
35
Fisher info of logit
36
Likelihood for finite mixture model
37
Intermediate distributions used in calculating posterior for finite mixture
38
39
Gibbs sampler on finite mixture model
40
Bayes factor
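For reference, with hypotheses H_0, H_1 and data x:

B_{01} = \frac{p(x \mid H_0)}{p(x \mid H_1)} = \frac{P(H_0 \mid x)/P(H_1 \mid x)}{P(H_0)/P(H_1)}

i.e. the ratio of marginal likelihoods, equivalently posterior odds divided by prior odds.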
41
Expand the calculation comparing H0 and H1 for the Bayes factor
42
BIC objective
43
BIC method
44
LLN for MC estimation
45
Determining Monte Carlo error (following from the LLN for Monte Carlo estimation)
46
CLT for MC
47
Inversion method
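Sketch: if U \sim \mathrm{Unif}(0,1) and F is a cdf with (generalised) inverse F^{-1}, then X = F^{-1}(U) \sim F; e.g. for the Exponential(\lambda) distribution, X = -\log(U)/\lambda.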
48
Accept reject method
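A minimal runnable sketch of the accept-reject idea; the Beta(2,2)-shaped target, the Uniform(0,1) proposal and the envelope constant M are illustrative choices, not from the notes.

import numpy as np

rng = np.random.default_rng(0)

def target_unnorm(x):
    # unnormalised target density, proportional to Beta(2, 2) on (0, 1)
    return x * (1.0 - x)

M = 0.25  # envelope constant: M * q(x) >= target_unnorm(x) for q = Uniform(0, 1)

def rejection_sample(n):
    out = []
    while len(out) < n:
        y = rng.uniform()              # draw from the proposal q
        u = rng.uniform()              # uniform for the accept/reject test
        if u <= target_unnorm(y) / M:  # accept with probability target / (M * q)
            out.append(y)
    return np.array(out)

print(rejection_sample(2000).mean())  # should be close to 0.5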
49
Efficiency of rejection sampling
50
Optimal M for rejection sampling
51
Extend rejection sampling to unnormalised densities
52
Importance sampling
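For reference, with target p, proposal q satisfying q > 0 wherever p > 0, and draws \theta_1, \dots, \theta_N \sim q:

\mathbb{E}_p[h(\theta)] = \int h(\theta) \frac{p(\theta)}{q(\theta)} q(\theta)\, d\theta \approx \frac{1}{N} \sum_{i=1}^{N} w(\theta_i)\, h(\theta_i), \qquad w(\theta_i) = \frac{p(\theta_i)}{q(\theta_i)}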
53
Optimal importance density and recycling samples
54
Unnormalised densities and corresponding empirical distributions
55
Sampling importance resampling (SIR)
56
Sample depletion in SIR
57
Notation for week 11
58
Define Markov chain (MC)
59
For a Markov chain: initial distribution, transition kernel and marginal distribution at step n
60
Stationarity, and proof via detailed balance
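Sketch of the detailed-balance argument, for a chain with transition kernel K and candidate stationary density \pi: if \pi(x) K(x, y) = \pi(y) K(y, x) for all x, y, then integrating over x gives

\int \pi(x) K(x, y)\, dx = \pi(y) \int K(y, x)\, dx = \pi(y)

so \pi is a stationary distribution of the chain.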
61
n-step transition definitions
62
Set up ergodic theorem
63
Ergodic theorem
64
TV convergence of Markov chains (a.s.)
65
TV convergence everywhere and multiple chain runs
66
CLT for MCMC
67
Setup MH algo
68
MH algo
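A minimal runnable sketch of a random-walk Metropolis(-Hastings) chain; the standard-normal target, step size and burn-in length are illustrative choices, not from the notes.

import numpy as np

rng = np.random.default_rng(1)

def log_target(theta):
    # unnormalised log-density of the illustrative standard-normal target
    return -0.5 * theta ** 2

def metropolis_hastings(n_iter, step=1.0, theta0=0.0):
    theta = theta0
    chain = np.empty(n_iter)
    for t in range(n_iter):
        prop = theta + step * rng.normal()                 # symmetric random-walk proposal
        log_alpha = log_target(prop) - log_target(theta)   # log acceptance ratio (proposal terms cancel)
        if np.log(rng.uniform()) < log_alpha:              # accept with probability min(1, alpha)
            theta = prop
        chain[t] = theta                                   # record the current state either way
    return chain

chain = metropolis_hastings(5000)
print(chain[1000:].mean(), chain[1000:].std())  # roughly 0 and 1 after discarding burn-in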
69
Sample autocorrelation and burn-in
70
Transition kernel in MH
71
Asymptotic behaviour of MH