Chapter 2_2 Information Theory Flashcards

1
Q

information theory

A

Information theory deals with quantifying uncertainty and with the transfer or storage of information, measured in bits.

2
Q

entropy

A

the quantified uncertainty in predicting the value of a random variable

How does one quantitatively measure the randomness of a random variable?

Roughly, the entropy of a random variable X, H(X), is a measure of the expected number of bits needed to represent the outcome of an event x ~ X:

H(X) = - ∑_x p(x) log₂ p(x)
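
As a concrete illustration beyond the card itself, here is a minimal Python sketch of the formula above; the function name entropy, the list-of-probabilities input, and the choice of log base 2 are assumptions of this sketch, not part of the card:

import math

def entropy(probs):
    # H(X) = -sum of p(x) * log2 p(x); terms with p(x) = 0 contribute nothing
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))        # fair coin: 1.0 bit
print(entropy([1.0]))             # certain outcome: 0.0 bits
print(entropy([1/3, 1/3, 1/3]))   # three equally likely outcomes: ~1.585 bits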

3
Q

H(X) = 0

H(X) = 1

A

the outcome is known with certainty, so there is no randomness

there is complete randomness, in that all outcomes are equally likely to happen

for a binary random variable, the amount of randomness thus varies from 0 to 1 bit; in general, a variable with n equally likely outcomes has the maximum possible entropy, log₂ n
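
A quick worked check of the two endpoints for a binary random variable (log base 2 assumed, so entropy is in bits):

H(certain outcome) = -(1 · log₂ 1) = 0 bits
H(fair coin) = -(0.5 · log₂ 0.5 + 0.5 · log₂ 0.5) = -(-0.5 - 0.5) = 1 bit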

4
Q

H(Y) < H(Z)

A

a random variable Y always takes a value of 1

a random variable Z is equally likely to take a value of 1, 2, or 3. So, in this case, H(Y) < H(Z), since the outcome of Y is much easier to predict than the outcome of Z
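
Plugging both distributions into the entropy formula (log base 2 assumed) makes the gap concrete:

H(Y) = -(1 · log₂ 1) = 0 bits
H(Z) = -(3 · (1/3) · log₂(1/3)) = log₂ 3 ≈ 1.585 bits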

5
Q

conditional entropy

A

The conditional entropy H(X | Y) is the expected remaining uncertainty in X once Y is known: H(X | Y) = - ∑_{x,y} p(x, y) log₂ p(x | y).

Intuitively, if X is completely determined by Y, then H(X | Y) = 0, since once we know Y there is no remaining uncertainty in X;
whereas if X and Y are independent, then H(X | Y) is the same as the original entropy of X, i.e., H(X | Y) = H(X), since knowing Y does not help at all in resolving the uncertainty of X.
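
As a hedged sketch (not part of the card), conditional entropy can be computed in Python from a joint distribution and checked against both extremes described above; the dictionary-of-pairs representation and the function name are assumptions of this sketch:

import math

def conditional_entropy(joint):
    # joint: dict mapping (x, y) -> p(x, y)
    # H(X | Y) = -sum over (x, y) of p(x, y) * log2( p(x, y) / p(y) )
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    return -sum(p * math.log2(p / p_y[y]) for (x, y), p in joint.items() if p > 0)

# X completely determined by Y (here X == Y): H(X | Y) = 0
print(conditional_entropy({(0, 0): 0.5, (1, 1): 0.5}))   # 0.0

# X and Y independent fair coins: H(X | Y) = H(X) = 1 bit
print(conditional_entropy({(x, y): 0.25 for x in (0, 1) for y in (0, 1)}))  # 1.0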

6
Q

mutual information

A

Another useful concept is mutual information, I(X; Y), defined on two random variables as the reduction in the entropy of X due to knowledge about another random variable Y, i.e.,

I(X; Y) = H(X) - H(X | Y)
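
Continuing the same sketch style (with the same assumed joint-distribution representation as before), mutual information can be checked numerically as H(X) - H(X | Y):

import math

def H(probs):
    # plain entropy of a marginal distribution, in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    # I(X; Y) = H(X) - H(X | Y), from a dict mapping (x, y) -> p(x, y)
    p_x, p_y = {}, {}
    for (x, y), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    h_x_given_y = -sum(p * math.log2(p / p_y[y]) for (x, y), p in joint.items() if p > 0)
    return H(p_x.values()) - h_x_given_y

# X is a copy of a fair coin Y: knowing Y removes all uncertainty, so I(X; Y) = H(X) = 1 bit
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))   # 1.0

# X and Y are independent fair coins: knowing Y tells us nothing about X, so I(X; Y) = 0
print(mutual_information({(x, y): 0.25 for x in (0, 1) for y in (0, 1)}))  # 0.0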
