Bayesian Statistics Flashcards
Bayes’ theorem
P(A|B) = P(B|A) * P(A) / P(B)
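A quick numeric check of the formula, a minimal sketch with made-up numbers for a diagnostic test (A = "has the condition", B = "positive test"; all values are hypothetical):

```python
# Hypothetical numbers for illustrating Bayes' theorem.
p_A = 0.01              # prior P(A): base rate of the condition
p_B_given_A = 0.95      # P(B|A): test sensitivity
p_B_given_not_A = 0.05  # P(B|not A): false positive rate

# Total probability of B: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_B = p_B_given_A * p_A + p_B_given_not_A * (1 - p_A)

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(f"P(A|B) = {p_A_given_B:.3f}")  # ≈ 0.161
```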
For Bayesian inference, P(θ|X), define θ and X.
θ is the parameter (what we want to infer from the data, the hypothesis)
X is the data
hence P(θ|X) is the probability of the hypothesis given the data.
define each part of
P(θ|X) ∝ P(X|θ) * P(θ)
posterior ∝ likelihood * prior
prior = initial belief
updated by the likelihood, also written L(θ|X) = P(X|θ)
posterior is our updated belief AFTER taking the data into account
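A minimal sketch of this proportionality in code, using a hypothetical coin-flip example and a grid approximation (the flat prior, the data, and the grid are all assumptions for illustration):

```python
import numpy as np

# Grid of candidate values for θ = probability of heads.
theta = np.linspace(0, 1, 1001)

# Prior: initial belief about θ, here a flat prior over the grid.
prior = np.ones_like(theta)
prior /= prior.sum()

# Data X: say 7 heads out of 10 flips. Likelihood P(X|θ), up to a constant.
heads, flips = 7, 10
likelihood = theta**heads * (1 - theta)**(flips - heads)

# posterior ∝ likelihood * prior; normalise so it sums to 1 over the grid.
posterior = likelihood * prior
posterior /= posterior.sum()

print("posterior mean of θ ≈", (theta * posterior).sum())  # ≈ 0.667
```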
updating posterior with multiple sets of data:
P(θ|X, Y, Z) ∝ ?
P(θ|X, Y, Z) ∝ P(X|θ) * P(Y|θ) * P(Z|θ) * P(θ)
i.e. the product of the individual likelihoods and the prior, assuming the data sets X, Y, Z are independent given θ.
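A sketch of this, continuing the hypothetical coin-flip grid above: updating once with all the data gives the same posterior as chaining the updates, because the posterior after X becomes the prior before Y (assuming the batches are independent given θ).

```python
import numpy as np

theta = np.linspace(0, 1, 1001)
prior = np.ones_like(theta)
prior /= prior.sum()

def binom_like(heads, flips):
    # Likelihood of one batch of flips, up to a constant.
    return theta**heads * (1 - theta)**(flips - heads)

# Three hypothetical batches of data X, Y, Z.
batches = [(3, 5), (6, 10), (2, 4)]

# Batch update: product of all likelihoods times the prior.
batch_post = prior * np.prod([binom_like(h, n) for h, n in batches], axis=0)
batch_post /= batch_post.sum()

# Sequential update: each posterior becomes the prior for the next batch.
seq_post = prior.copy()
for h, n in batches:
    seq_post = seq_post * binom_like(h, n)
    seq_post /= seq_post.sum()

print(np.allclose(batch_post, seq_post))  # True
```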
what does it mean to describe a prior or a likelihood as “normal”?
normal prior: our belief about θ is expressed as a normal distribution centred on θ0, the believed value, with a variance σ0² that reflects how certain we are.
normal likelihood: a measurement x of the quantity θ is modelled as normally distributed around θ with variance σ², representing the measurement uncertainty.
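A small sketch of both, with hypothetical numbers (scipy's norm takes a mean and a standard deviation):

```python
from scipy.stats import norm

# Normal prior for θ: believed value θ0 with some uncertainty (hypothetical numbers).
theta0, sigma0 = 10.0, 2.0   # prior mean and prior standard deviation
prior = norm(loc=theta0, scale=sigma0)

# Normal likelihood for a single measurement x of θ: x ~ N(θ, σ²),
# where σ reflects the measurement uncertainty.
sigma = 1.0
def likelihood(x, theta):
    return norm(loc=theta, scale=sigma).pdf(x)

print(prior.pdf(10.0))         # prior density at the believed value
print(likelihood(11.0, 10.0))  # how plausible a measurement of 11 is if θ = 10
```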
what is “precision” 𝜏 ?
how to find precision of the posterior?
defined as the reciprocal of variance.
precision of posterior (for a normal prior and normal likelihood, single measurement): 𝜏1 = 𝜏0 + 𝜏
where 𝜏0 is the precision of the prior, and 𝜏 is the precision of the likelihood.
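A sketch checking the rule numerically for the normal-normal case above (all numbers hypothetical): build the posterior on a grid from the normal prior and the normal likelihood of a single measurement, then compare its precision to 𝜏0 + 𝜏.

```python
import numpy as np
from scipy.stats import norm

theta0, sigma0 = 10.0, 2.0   # normal prior: θ ~ N(θ0, σ0²)
sigma = 1.0                  # normal likelihood: x ~ N(θ, σ²)
x = 11.5                     # a single hypothetical measurement

tau0 = 1 / sigma0**2         # precision of the prior
tau = 1 / sigma**2           # precision of the likelihood

# Grid approximation of posterior ∝ likelihood * prior.
theta = np.linspace(0, 20, 20001)
post = norm.pdf(x, loc=theta, scale=sigma) * norm.pdf(theta, loc=theta0, scale=sigma0)
post /= post.sum()

post_mean = (theta * post).sum()
post_var = ((theta - post_mean) ** 2 * post).sum()

print("grid posterior precision:", 1 / post_var)  # ≈ 1.25
print("tau0 + tau:              ", tau0 + tau)    # 1.25
```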