Multivariate Distributions Flashcards

1
Q

Joint distribution function

A

F_X,Y (x,y) = P(X ≤ x, Y ≤ y)

For vector(X) = (X,Y):

F_X,Y (x,y) = ∫[-∞, y] ∫[-∞, x] f_X,Y (s,t) .ds .dt

A valid joint PDF satisfies:
• f(x,y) ≥ 0 for all (x,y) in R²
• ∫[-∞, ∞] ∫[-∞, ∞] f(x,y) .dx .dy = 1

Probabilities: P((X,Y) ∈ D) =
double integral over the region D of f_X,Y (x,y) .dx .dy

D must lie within the original region (the support of f_X,Y)

Beware: in the discrete case the inequalities need care, since P(X < x) ≠ P(X ≤ x)
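As a concrete check, the two PDF conditions and the probability formula can be verified numerically. The density f(x,y) = x + y on the unit square is a hypothetical example of mine, not one from the notes:

```python
# Hypothetical joint PDF: f(x, y) = x + y on [0,1] x [0,1], 0 elsewhere.
def f(x, y):
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

n = 400
h = 1.0 / n

# Second condition: total mass should integrate to 1 (midpoint Riemann sum).
total = sum(f((i + 0.5) * h, (j + 0.5) * h) * h * h
            for i in range(n) for j in range(n))

# P((X, Y) in D) for D = [0, 0.5] x [0, 0.5]; exact value is 1/8.
m = 200
hd = 0.5 / m
prob = sum(f((i + 0.5) * hd, (j + 0.5) * hd) * hd * hd
           for i in range(m) for j in range(m))
```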

2
Q

Expectation of a multivariate function

E(g(X,Y))

A

E[g(X,Y)]

=

∫[-∞, ∞] ∫[-∞, ∞] g(x,y) • f_X,Y (x,y) .dx .dy
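A numerical sketch of this formula, reusing a hypothetical density f(x,y) = x + y on the unit square with g(x,y) = xy (for that density the exact answer is E[XY] = 1/3):

```python
# Hypothetical joint PDF on the unit square; g(X, Y) = XY.
def f(x, y):
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

n = 400
h = 1.0 / n
# E[g(X,Y)] = double integral of g(x,y) * f(x,y) dx dy (midpoint rule).
e_xy = sum((i + 0.5) * h * (j + 0.5) * h * f((i + 0.5) * h, (j + 0.5) * h) * h * h
           for i in range(n) for j in range(n))
# exact value for this density: E[XY] = 1/3
```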

3
Q

Marginal PDF of joint distribution

A

Marginal PDF of X: integrate out y

f_X(x) =
∫[-∞, ∞] f_X,Y (x,y) .dy

Limits are those of y.
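Integrating out y can be done numerically; for the hypothetical density f(x,y) = x + y on the unit square (my example) the exact marginal is f_X(x) = x + 1/2:

```python
# Hypothetical joint PDF on the unit square.
def f(x, y):
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

def marginal_x(x, n=1000):
    # integrate out y over its limits (here 0 to 1), midpoint rule
    h = 1.0 / n
    return sum(f(x, (j + 0.5) * h) * h for j in range(n))

fx = marginal_x(0.3)   # exact marginal is f_X(x) = x + 1/2, so 0.8 here
```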

4
Q

Conditional PDF of X given Y=y

A

f_X|Y=y (x)

=

f_X,Y (x,y) / f_Y (y)

5
Q

Independence of (X,Y)

A
A pair (X,Y) is independent
IF AND ONLY IF

P[X ∈ A, Y ∈ B] = P[X ∈ A] • P[Y ∈ B]

for all subsets A, B of R.

Cov(X,Y) = 0 does not imply independence, but if X and Y are independent then Cov(X,Y) = 0.
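The caveat can be seen by simulation. A hypothetical example of mine: X uniform on (−1, 1) and Y = X² are clearly dependent, yet Cov(X, X²) = E[X³] = 0 (seeded for reproducibility):

```python
import random

random.seed(0)
N = 200_000
xs = [random.uniform(-1.0, 1.0) for _ in range(N)]
ys = [x * x for x in xs]          # Y is a function of X: maximally dependent

mx = sum(xs) / N
my = sum(ys) / N
# sample covariance: close to the true value Cov(X, X^2) = E[X^3] = 0
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / N
```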

6
Q

Lemma 4.6 : independence for joint distribution

A

If f_X(x) and f_Y(y) are PDFs,

then in the continuous case X and Y are independent if and only if

f_X,Y (x,y) = f_X(x) • f_Y(y)

I.e. a test for independence: the joint PDF factorises as g(x) • h(y), a separable product of functions (and the support is a product region).

7
Q

Correlation coefficient

A

Correlation coefficient

ρ(X,Y) = Cov(X,Y) / √( Var(X) • Var(Y) )

Cov(X,Y)
= E[(X − μ_X)(Y − μ_Y)]
= E[XY] − E[X]E[Y]

E[XY] = double integral over the region of xy • joint PDF
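The defining formulas translate directly into code; for a perfectly linear hypothetical data set (my example) the coefficient should come out as 1:

```python
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # y = 2x exactly, so rho should be 1

n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
vx = sum((x - mx) ** 2 for x in xs) / n
vy = sum((y - my) ** 2 for y in ys) / n

rho = cov / (vx * vy) ** 0.5    # Cov(X,Y) / sqrt(Var(X) * Var(Y))
```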

8
Q

Conditional expectation of X given Y=y

A

E[X|Y=y]

=

∫[-∞, ∞] x • f_X|Y=y (x) .dx

Limits with respect to x.
The result will be a function of y.

Find the conditional PDF from the joint and marginal.
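Worked numerically for the hypothetical density f(x,y) = x + y on the unit square (my example): there f_Y(y) = y + 1/2, so E[X | Y=y] = (1/3 + y/2)/(y + 1/2), which is 7/12 at y = 0.5:

```python
# Hypothetical joint PDF on the unit square.
def f(x, y):
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

y = 0.5
n = 2000
h = 1.0 / n
# marginal f_Y(y) by integrating out x (here f_Y(0.5) = 1.0)
fy = sum(f((i + 0.5) * h, y) * h for i in range(n))
# E[X | Y=y] = integral of x * f_{X|Y=y}(x) dx, with f_{X|Y=y} = f / f_Y
cond_mean = sum((i + 0.5) * h * f((i + 0.5) * h, y) * h for i in range(n)) / fy
# exact value for this density: 7/12 at y = 0.5
```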

9
Q

Conditional variance of X|Y

Conditional covariance of (X,Y)|Z

A

Var(X|Y)
= E[ (X − E[X|Y])² | Y ]

Cov(X,Y|Z)
= E[XY|Z] − E[X|Z] • E[Y|Z]

10
Q

Lemma 4.10: used to find the mean and variance of X by conditioning

E.g. if we know the conditional distributions

A
1)
E[X] = E[ E[X|Y] ]
2)
Var(X) = E[ Var(X|Y) ] + Var( E[X|Y] )
3)
Cov(X,Y)
= E( Cov(X,Y|Z) ) + Cov( E[X|Z], E[Y|Z] )
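A seeded Monte Carlo check of 1) and 2), with a hypothetical hierarchy of mine, Y ~ Uniform(0,1) and X | Y=y ~ N(y, 1): then E[X] = E[Y] = 1/2 and Var(X) = 1 + 1/12.

```python
import random

random.seed(1)
N = 200_000
xs = []
for _ in range(N):
    y = random.random()               # Y ~ Uniform(0, 1)
    xs.append(random.gauss(y, 1.0))   # X | Y=y ~ N(y, 1)

mean_x = sum(xs) / N
var_x = sum((x - mean_x) ** 2 for x in xs) / N
# theory: E[X] = E[E[X|Y]] = E[Y] = 1/2
#         Var(X) = E[Var(X|Y)] + Var(E[X|Y]) = 1 + 1/12
```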

11
Q

Transformations of bivariate distributions

(multivariate)

A

Define transformations u = u(x,y) and v = v(x,y) that are continuous, differentiable and one-to-one

((x,y) to (u,v))

U = u(X,Y), V = v(X,Y)

If the inverse exists, x = x(u,v) and y = y(u,v),

then the joint PDF of the transformed pair is

f_U,V (u,v)

=

{ f_X,Y ( x(u,v), y(u,v) ) • |det(J)|   for all (u,v) in the image
{ 0 otherwise

J = matrix ( ∂x/∂u  ∂x/∂v )
           ( ∂y/∂u  ∂y/∂v )

  • joint PDF of X,Y
  • transformation and inverse; calculate |det(J)|
  • region of (u,v) corresponding to the transformation, for which the joint PDF f_X,Y is greater than 0

Sketch!!!
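A small worked instance (my example): U = X + Y, V = X − Y has inverse x = (u+v)/2, y = (u−v)/2, so |det(J)| = 1/2:

```python
# Partial derivatives of the inverse map x = (u+v)/2, y = (u-v)/2.
dxdu, dxdv = 0.5, 0.5
dydu, dydv = 0.5, -0.5

det_J = dxdu * dydv - dxdv * dydu   # determinant of the 2x2 Jacobian
abs_det_J = abs(det_J)              # the factor multiplying f_X,Y
# f_U,V(u, v) = f_X,Y((u+v)/2, (u-v)/2) * 0.5 on the image region
```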

12
Q

Transformations of bivariate tricks:

A

If only one transformed variable U is of interest, set V = X; find the joint PDF of (U,V) and integrate out v to get the marginal PDF of U

• may need to find the joint PDF of (X,Y) via independence first

13
Q

Student's t distribution

A

Sample of normals with unknown parameters.
Sample mean X̄ and unbiased sample variance S² = (1/(n−1)) Σ (X_i − X̄)²

  • (n−1)S² / σ² has a chi-squared distribution with parameter n−1
  • (√n / σ) • (X̄ − μ) has a standard normal distribution
  • if Z is standard normal and W is chi-squared with parameter n, independent, then

Z / √(W/n) ~ t_n distribution
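A seeded simulation of the third bullet: build Z/√(W/n) from normals and check it behaves like a t_n variable. The variance of t_n is n/(n−2) for n > 2 (a standard fact), i.e. 1.25 for n = 10; the parameter choices are mine:

```python
import random

random.seed(2)
n = 10          # chi-squared parameter / degrees of freedom
N = 100_000
ts = []
for _ in range(N):
    z = random.gauss(0.0, 1.0)                              # Z ~ N(0, 1)
    w = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n))  # W ~ chi^2_n
    ts.append(z / (w / n) ** 0.5)                           # T ~ t_n

mean_t = sum(ts) / N
var_t = sum((t - mean_t) ** 2 for t in ts) / N
# expected: mean near 0, variance near n/(n-2) = 1.25
```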

14
Q

Multivariate normal distribution

Expectation vector

Covariance matrix

A

E(X) = ( E(X_1), …, E(X_k) )^T
= vector(μ) = (μ_1, …, μ_k)^T

Cov(X) is the k×k matrix whose first row is ( cov(X_1,X_1) cov(X_1,X_2) … ), etc.

Diagonals are the variances: σ_ii = σ²_i

σ_ij = ρ_ij • σ_i • σ_j

where ρ_ij is the correlation coefficient of X_i and X_j
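Building Σ entry-by-entry from the σ_i and ρ_ij (hypothetical values σ_1 = 2, σ_2 = 3, ρ_12 = 0.5):

```python
sigma = [2.0, 3.0]                 # standard deviations sigma_i
rho = [[1.0, 0.5], [0.5, 1.0]]     # correlation matrix, rho_ii = 1

# sigma_ij = rho_ij * sigma_i * sigma_j
Sigma = [[rho[i][j] * sigma[i] * sigma[j] for j in range(2)] for i in range(2)]
# diagonals are the variances sigma_i^2 (4 and 9); off-diagonal is 0.5*2*3 = 3
```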

15
Q

Lemma 6.3 transformation of Multivariate normal

E[Y] cov(Y)

A

Affine trans

Y= AX +b

2×2 matrix A,
2×1 vector b

E[Y] = AE[X] +b

Cov(Y) = ACov(X) A^T

Transpose!!!
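Cov(Y) = A Cov(X) A^T written out with plain nested lists (A and Cov(X) are hypothetical values of mine):

```python
def matmul(A, B):
    # plain-list matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

A = [[1.0, 2.0], [0.0, 1.0]]
cov_X = [[4.0, 3.0], [3.0, 9.0]]

cov_Y = matmul(matmul(A, cov_X), transpose(A))
# check one entry by hand: Var(Y_1) = Var(X_1 + 2*X_2)
#                        = 4 + 4*9 + 2*2*3 = 52
```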

16
Q

For two normals

X+Y

A

Var(X+Y)

= σ^2_X + 2Cov (X,Y) + σ^2_Y
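A quick numeric check with hypothetical values σ²_X = 4, σ²_Y = 9, Cov(X,Y) = 3:

```python
var_x, var_y, cov_xy = 4.0, 9.0, 3.0

var_sum = var_x + 2.0 * cov_xy + var_y   # Var(X + Y) = 4 + 6 + 9
```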

17
Q

Properties of bivariate normal distribution lemmas

A

X = (X_1, X_2)^T

Mean vector = (μ_1, μ_2)^T

Covariance matrix Σ =
( σ_11 σ_12 )
( σ_21 σ_22 )

Lemma 6.7: the marginal distributions of vector(X) are
X_1 ~ N(μ_1, σ_11) and X_2 ~ N(μ_2, σ_22)

Lemma 6.8: the two components of the bivariate normal are independent if and only if Cov(X_1, X_2) = 0, i.e. the off-diagonal entries of the covariance matrix are 0 (bivariate normal only!)

18
Q

Bivariate normal dist lemma 6.9 for conditional distribution

A

The conditional distribution of X_1 given X_2 = x_2 is a normal distribution with mean μ_1 + (σ_12/σ_22)(x_2 − μ_2) and variance σ_11 − σ²_12/σ_22.
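With the standard bivariate-normal result (conditional mean μ_1 + (σ_12/σ_22)(x_2 − μ_2), conditional variance σ_11 − σ²_12/σ_22), the computation is one line each; the numbers below are hypothetical:

```python
mu = [1.0, 2.0]
Sigma = [[4.0, 1.2], [1.2, 1.0]]   # sigma_11, sigma_12 / sigma_21, sigma_22
x2 = 3.0                           # observed value of X_2

cond_mean = mu[0] + Sigma[0][1] / Sigma[1][1] * (x2 - mu[1])   # 1 + 1.2*1 = 2.2
cond_var = Sigma[0][0] - Sigma[0][1] ** 2 / Sigma[1][1]        # 4 - 1.44 = 2.56
```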

19
Q

Bivariate normal distribution:

For transformations

A

For non-singular A, the bivariate transformation Y = AX + b has a normal distribution with mean vector Aμ + b and covariance matrix A Σ A^T, as before.

20
Q

Multivariate normal distribution properties

A

Any subset of X_1, …, X_k has marginal distributions that are still multivariate normal

A pair of components X_i, X_j is independent if and only if Cov(X_i, X_j) = 0

AX + b transforms are also normal

Conditional distributions are also normal