7. Advanced Probability Concepts Flashcards

1
Q

What is joint probability?

A

The probability of two events occurring together: P(A ∩ B).
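
A minimal sketch, not part of the original card, that finds a joint probability by counting outcomes in a fair-die sample space; the die and the events A (even) and B (greater than 3) are illustrative assumptions.

```python
from fractions import Fraction

outcomes = set(range(1, 7))                  # sample space of a fair six-sided die
A = {x for x in outcomes if x % 2 == 0}      # event A: the roll is even
B = {x for x in outcomes if x > 3}           # event B: the roll is greater than 3

p_joint = Fraction(len(A & B), len(outcomes))  # P(A ∩ B) = |A ∩ B| / |Ω|
print(p_joint)                                 # 1/3 (outcomes 4 and 6)
```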

2
Q

What is marginal probability?

A

The probability of an event occurring regardless of other variables, obtained by summing (or integrating) the joint distribution over those variables: P(X = x) = Σ_y P(X = x, Y = y).
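
A short sketch, not from the original deck, assuming NumPy is available; the 2×2 joint table is an illustrative assumption. Marginals are obtained by summing over the other variable.

```python
import numpy as np

joint = np.array([[0.10, 0.30],   # rows index X = 0, 1
                  [0.20, 0.40]])  # columns index Y = 0, 1

p_x = joint.sum(axis=1)           # P(X) = sum over y of P(X, Y = y)
p_y = joint.sum(axis=0)           # P(Y) = sum over x of P(X = x, Y)
print(p_x, p_y)                   # [0.4 0.6] [0.3 0.7]
```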

3
Q

What is a moment generating function?

A

The function M_X(t) = E[e^(tX)]; its n-th derivative at t = 0 gives the n-th moment of the distribution, from which the mean, variance, etc. follow.
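
A hedged sketch, not part of the original card, assuming SymPy is installed: it recovers the mean and variance of an Exponential(λ) distribution by differentiating its MGF at t = 0.

```python
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
M = lam / (lam - t)                  # MGF of an Exponential(lambda) random variable

mean = sp.diff(M, t, 1).subs(t, 0)   # first moment  E[X]   = M'(0)
m2   = sp.diff(M, t, 2).subs(t, 0)   # second moment E[X^2] = M''(0)
var  = sp.simplify(m2 - mean**2)

print(mean, var)                     # 1/lambda and 1/lambda**2
```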

4
Q

What is the law of total expectation?

A

E[X] = E[E[X|Y]].
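
A Monte Carlo check of the identity, not from the original deck, assuming NumPy; the two-group model (Y Bernoulli, X normal within each group) and its parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100_000, 0.3
y = rng.random(n) < p                          # Y ~ Bernoulli(0.3)
x = np.where(y, rng.normal(5.0, 1.0, n),       # X | Y=1 ~ Normal(5, 1)
                rng.normal(1.0, 1.0, n))       # X | Y=0 ~ Normal(1, 1)

lhs = x.mean()                                 # direct estimate of E[X]
rhs = p * 5.0 + (1 - p) * 1.0                  # E[E[X|Y]] = P(Y=1)*5 + P(Y=0)*1
print(lhs, rhs)                                # both close to 2.2
```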

5
Q

What is Chebyshev’s inequality?

A

A probability bound stating that at least (1 - 1/k²) of the probability mass lies within k standard deviations of the mean; equivalently, P(|X - μ| ≥ kσ) ≤ 1/k².
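
An empirical check, not part of the original card, assuming NumPy; the exponential sample is an illustrative choice. The observed fractions should sit at or above Chebyshev's lower bound.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=100_000)
mu, sigma = x.mean(), x.std()

for k in (2, 3):
    within = np.mean(np.abs(x - mu) < k * sigma)
    print(k, within, 1 - 1 / k**2)   # observed fraction vs. Chebyshev's lower bound
```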

6
Q

What is entropy in information theory?

A

A measure of the uncertainty in a probability distribution: H(X) = -Σ p(x) log p(x).
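
A minimal sketch, not from the original deck, assuming NumPy; it computes Shannon entropy in bits (log base 2) for two illustrative coin distributions.

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                         # terms with p = 0 contribute nothing
    return -np.sum(p * np.log2(p))

print(entropy([0.5, 0.5]))               # 1.0 bit: a fair coin is maximally uncertain
print(entropy([0.9, 0.1]))               # ~0.47 bits: a biased coin is less uncertain
```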

7
Q

What is mutual information?

A

A measure of the dependence between two random variables: I(X; Y) = H(X) - H(X|Y), which equals zero exactly when X and Y are independent.
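
A sketch, not part of the original card, assuming NumPy; the small joint table is an illustrative assumption. It evaluates I(X; Y) = Σ p(x, y) log₂ [p(x, y) / (p(x) p(y))] directly.

```python
import numpy as np

joint = np.array([[0.30, 0.10],
                  [0.10, 0.50]])           # assumed joint distribution of (X, Y)
px = joint.sum(axis=1, keepdims=True)      # marginal of X
py = joint.sum(axis=0, keepdims=True)      # marginal of Y

mi = np.sum(joint * np.log2(joint / (px * py)))
print(mi)                                  # > 0, so X and Y are dependent
```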

8
Q

What is the Kullback-Leibler (KL) divergence?

A

A measure of how one probability distribution P diverges from a reference distribution Q: D_KL(P || Q) = Σ P(x) log(P(x)/Q(x)); it is non-negative and not symmetric in P and Q.
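
A short sketch, not from the original deck, assuming NumPy and strictly positive probabilities; the two coin distributions are illustrative. Note that swapping the arguments changes the value.

```python
import numpy as np

def kl(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p * np.log2(p / q))     # assumes p > 0 and q > 0 everywhere

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl(p, q), kl(q, p))                 # the two values differ: D(P||Q) != D(Q||P)
```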

9
Q

What is Jensen’s inequality?

A

For a convex function g, E[g(X)] ≥ g(E[X]).
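
A numerical illustration, not part of the original card, assuming NumPy; the convex function g(x) = x² and the uniform sample are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0.0, 1.0, 100_000)

print((x**2).mean())     # E[g(X)] ~ 1/3
print(x.mean()**2)       # g(E[X]) ~ 1/4, smaller, as Jensen's inequality requires
```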

10
Q

What is covariance?

A

A measure of how two random variables vary together: Cov(X, Y) = E[(X - E[X])(Y - E[Y])].
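
A minimal sketch, not from the original deck, assuming NumPy; the linear relationship y = 2x + noise is an illustrative assumption. It compares the definition with NumPy's estimate.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=100_000)
y = 2.0 * x + rng.normal(size=100_000)    # y moves with x, so covariance is near 2

cov = np.mean((x - x.mean()) * (y - y.mean()))   # Cov from its definition
print(cov, np.cov(x, y, ddof=0)[0, 1])           # definition vs. numpy's estimate
```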
