3.4.1 Covariance Flashcards

(20 cards)

1
Q

What does covariance measure?

A

How two random variables change together — whether increases in one tend to correspond with increases or decreases in the other.

2
Q

What is the formula for covariance?

A

Cov(X, Y) = E[(X - E[X]) * (Y - E[Y])]

3
Q

What is the shortcut formula for covariance?

A

Cov(X, Y) = E[XY] - E[X] * E[Y]
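
A minimal sketch of both formulas in Python with NumPy (a language choice made here; the cards do not specify one). The joint PMF is made up purely for illustration; the point is that the definition and the shortcut give the same number.

```python
import numpy as np

# Hypothetical joint PMF: rows index x in {0, 1}, columns index y in {0, 1, 2}.
x_vals = np.array([0.0, 1.0])
y_vals = np.array([0.0, 1.0, 2.0])
pmf = np.array([[0.10, 0.20, 0.10],
                [0.15, 0.15, 0.30]])  # entries sum to 1

E_X = np.sum(x_vals[:, None] * pmf)   # E[X] from the joint table
E_Y = np.sum(y_vals[None, :] * pmf)   # E[Y] from the joint table

# Definition: Cov(X, Y) = E[(X - E[X]) * (Y - E[Y])]
cov_def = np.sum((x_vals[:, None] - E_X) * (y_vals[None, :] - E_Y) * pmf)

# Shortcut: Cov(X, Y) = E[XY] - E[X] * E[Y]
E_XY = np.sum(x_vals[:, None] * y_vals[None, :] * pmf)
cov_shortcut = E_XY - E_X * E_Y

assert np.isclose(cov_def, cov_shortcut)
print(cov_def, cov_shortcut)  # same value both ways
```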

4
Q

What does positive covariance mean?

A

When X increases, Y tends to increase too.

5
Q

What does negative covariance mean?

A

When X increases, Y tends to decrease.
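
A quick illustration of the sign on hypothetical toy data (Python/NumPy assumed, as in the sketch above): the sample covariance is positive when the two series move together and negative when they move in opposite directions.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y_up = np.array([2.1, 2.9, 4.2, 5.1, 6.0])    # tends to rise with x
y_down = np.array([5.8, 5.1, 3.9, 3.2, 2.0])  # tends to fall as x rises

# np.cov returns the 2x2 covariance matrix; the off-diagonal entry is Cov(x, y).
print(np.cov(x, y_up)[0, 1])    # positive
print(np.cov(x, y_down)[0, 1])  # negative
```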

6
Q

Can covariance be negative?

A

Yes — unlike variance, which is always non-negative.

7
Q

What is Cov(X, X)?

A

Var(X) — the covariance of a variable with itself is its variance.
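
A one-line check of this identity on arbitrary made-up data (Python/NumPy assumed): the covariance of a series with itself equals its variance, provided both use the same (n - 1) denominator.

```python
import numpy as np

x = np.array([1.0, 4.0, 2.0, 8.0, 5.0])  # arbitrary data

cov_xx = np.cov(x, x)[0, 1]   # sample Cov(X, X)
var_x = np.var(x, ddof=1)     # sample Var(X), also divided by n - 1
assert np.isclose(cov_xx, var_x)
```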

8
Q

What is Cov(c, X) or Cov(c₁, c₂), where c, c₁, and c₂ are constants?

A

0 — constants have no variability.

9
Q

What happens when you factor a constant out of a covariance?

A

Cov(aX, bY) = ab * Cov(X, Y)

10
Q

Do additive constants affect covariance?

A

No — Cov(X + c, Y) = Cov(X, Y)

11
Q

Is covariance distributive?

A

Yes, it is additive in each argument (bilinear): Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z)
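
The constant, scaling, shifting, and additivity rules from the last few cards can all be sanity-checked numerically. A sketch on randomly simulated data (Python/NumPy assumed); the constants a, b, c are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
x, y, z = rng.normal(size=(3, 1000))   # three simulated samples

def cov(u, v):
    """Sample covariance of two equal-length arrays."""
    return np.cov(u, v)[0, 1]

a, b, c = 2.0, -3.0, 7.0

assert np.isclose(cov(np.full_like(x, c), x), 0.0)       # Cov(c, X) = 0
assert np.isclose(cov(a * x, b * y), a * b * cov(x, y))  # Cov(aX, bY) = ab * Cov(X, Y)
assert np.isclose(cov(x + c, y), cov(x, y))              # additive constants drop out
assert np.isclose(cov(x + y, z), cov(x, z) + cov(y, z))  # Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z)
```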

12
Q

What is the formula for Var(X + Y) when X and Y are not independent?

A

Var(X + Y) = Var(X) + Var(Y) + 2 * Cov(X, Y)
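
A numerical check of this formula on simulated, deliberately correlated data (Python/NumPy assumed); it holds exactly for sample moments as long as both sides use the same (n - 1) denominator.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=1000)
y = 0.5 * x + rng.normal(size=1000)   # y is built from x, so the two are correlated

lhs = np.var(x + y, ddof=1)
rhs = np.var(x, ddof=1) + np.var(y, ddof=1) + 2 * np.cov(x, y)[0, 1]
assert np.isclose(lhs, rhs)           # Var(X + Y) = Var(X) + Var(Y) + 2 * Cov(X, Y)
```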

13
Q

How do you compute covariance from a joint distribution?

A

Use: Cov(X, Y) = E[XY] - E[X] * E[Y], where each expectation is computed from the joint PMF or PDF.
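
For a continuous joint density the same shortcut applies with sums replaced by integrals. A sketch using SciPy's dblquad on the textbook density f(x, y) = x + y over the unit square (an example chosen here, not taken from these cards).

```python
from scipy.integrate import dblquad

# Joint PDF f(x, y) = x + y on 0 <= x <= 1, 0 <= y <= 1 (it integrates to 1).
# dblquad integrates func(y, x) with y as the inner variable.
def expectation(g):
    return dblquad(lambda y, x: g(x, y) * (x + y),
                   0, 1, lambda x: 0, lambda x: 1)[0]

E_X = expectation(lambda x, y: x)       # 7/12
E_Y = expectation(lambda x, y: y)       # 7/12
E_XY = expectation(lambda x, y: x * y)  # 1/3

print(E_XY - E_X * E_Y)  # Cov(X, Y) = 1/3 - (7/12)^2 = -1/144 ≈ -0.0069
```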

14
Q

Why might E[XY] be easier to compute than Cov(X, Y) directly?

A

Because E[XY] is a raw moment, not a central moment: you can sum x * y * f(x, y) straight from the joint table without first subtracting the means.

15
Q

How many covariance terms do you need for 3 variables X, Y, Z?

A

3 pairwise terms: Cov(X, Y), Cov(X, Z), Cov(Y, Z). In the expansion of Var(X + Y + Z), each appears with a factor of 2.
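
Where the count shows up: expanding Var(X + Y + Z). A quick check on simulated correlated data (Python/NumPy assumed).

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=1000)
y = x + rng.normal(size=1000)   # correlated with x
z = y + rng.normal(size=1000)   # correlated with both

cov = lambda u, v: np.cov(u, v)[0, 1]
lhs = np.var(x + y + z, ddof=1)
rhs = (np.var(x, ddof=1) + np.var(y, ddof=1) + np.var(z, ddof=1)
       + 2 * (cov(x, y) + cov(x, z) + cov(y, z)))  # the 3 pairwise covariance terms
assert np.isclose(lhs, rhs)
```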

16
Q

What happens to Cov(X, Y) if X and Y are independent?

A

Cov(X, Y) = 0

17
Q

Does Cov(X, Y) = 0 imply X and Y are independent?

A

No — zero covariance implies uncorrelated, but not necessarily independent.

18
Q

What should you never assume on an exam?

A

Independence — unless explicitly stated or proven.

19
Q

If Cov(X, Y) = 0 but X and Y are not independent, what does that mean?

A

There is no linear relationship, but they might still be dependent in a nonlinear way.
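
The standard counterexample, sketched here (the specific distribution is an illustration, not from these cards): X uniform on {-1, 0, 1} and Y = X². The covariance is exactly 0, yet Y is completely determined by X.

```python
import numpy as np

x_vals = np.array([-1.0, 0.0, 1.0])
p = np.array([1/3, 1/3, 1/3])       # X uniform on {-1, 0, 1}
y_vals = x_vals**2                  # Y = X^2 is a function of X, so clearly dependent

E_X = np.sum(x_vals * p)            # 0
E_Y = np.sum(y_vals * p)            # 2/3
E_XY = np.sum(x_vals * y_vals * p)  # E[X^3] = 0

print(E_XY - E_X * E_Y)             # Cov(X, Y) = 0

# Not independent: P(X = 1, Y = 0) = 0, but P(X = 1) * P(Y = 0) = 1/9.
```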

20
Q

How can you show independence to prove Cov(X, Y) = 0?

A

Show f(x, y) = f_X(x) * f_Y(y) and that the ranges of X and Y do not restrict each other.
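
A sketch of that check on a small made-up joint PMF (Python/NumPy assumed): the table is built as the outer product of its marginals, so it factors in every cell, the support is a full rectangle, and the covariance comes out as 0.

```python
import numpy as np

x_vals = np.array([0.0, 1.0])
y_vals = np.array([0.0, 1.0, 2.0])
f_X = np.array([0.4, 0.6])        # marginal PMF of X
f_Y = np.array([0.2, 0.5, 0.3])   # marginal PMF of Y
pmf = np.outer(f_X, f_Y)          # f(x, y) = f_X(x) * f_Y(y) for every cell

assert np.allclose(pmf, f_X[:, None] * f_Y[None, :])  # factorization holds everywhere

E_X = np.sum(x_vals * f_X)
E_Y = np.sum(y_vals * f_Y)
E_XY = np.sum(x_vals[:, None] * y_vals[None, :] * pmf)
print(E_XY - E_X * E_Y)  # 0 (up to floating-point rounding)
```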