Information Theory 2 Flashcards
(14 cards)
If the noise equals zero, what is the mutual information between X and Y equal to?
if noise = 0, then X = Y
thus
I(X,Y) = H(X) = H(Y)
mutual information = entropy of X = entropy of Y
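A minimal numeric sketch of the noiseless case (the source distribution below is hypothetical, chosen only for illustration): when Y = X, the conditional entropy H(Y|X) is zero, so the mutual information equals the entropy of either end.

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Noiseless channel: Y = X, so knowing X removes all uncertainty about Y.
p_x = [0.5, 0.25, 0.25]      # hypothetical source distribution for X
h_x = entropy(p_x)           # H(X)
h_y = entropy(p_x)           # Y = X, so H(Y) = H(X)
h_y_given_x = 0.0            # no noise => no residual uncertainty in Y
i_xy = h_y - h_y_given_x     # I(X,Y) = H(Y) - H(Y|X)
print(h_x, h_y, i_xy)        # all three are equal (1.5 bits here)
```

With these numbers, H(X) = H(Y) = I(X,Y) = 1.5 bits, matching the card above.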
If there is no noise in the information transfer from X to Y, what is the entropy at X and Y?
the entropy of X is the same as the entropy of Y
H(X)=H(Y)
What is the equation for the mutual information between points X and Y?
mutual info of X and Y = entropy of Y - noise entropy
I(X,Y) = H(Y) - H(Y|X)
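The equation above can be sketched for a noisy channel. The example below assumes a binary symmetric channel with a hypothetical flip probability eps (not from the cards), where the noise entropy H(Y|X) is just the binary entropy of eps:

```python
import math

def h2(p):
    """Binary entropy (in bits) of a probability p."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical binary symmetric channel: each bit of X is flipped
# with probability eps on its way to Y.
eps = 0.1                          # crossover (noise) probability
p1 = 0.5                           # P(X = 1), uniform source
p_y1 = p1 * (1 - eps) + (1 - p1) * eps   # P(Y = 1)
h_y = h2(p_y1)                     # H(Y)
h_y_given_x = h2(eps)              # noise entropy H(Y|X)
i_xy = h_y - h_y_given_x           # I(X,Y) = H(Y) - H(Y|X)
print(i_xy)                        # bits of information that survive the noise
```

With eps = 0.1, about 0.53 bits of each transmitted bit actually get through; the remaining uncertainty is the noise entropy H(Y|X).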
Is the message identical at both points (X and Y) when there is noise?
no!
no noise = identical
Why is the message at points X and Y not identical when there is noise?
because information is lost when sending a message from X to Y due to noise
How do you calculate the noise entropy of X->Y?
noise entropy = H(Y|X)
entropy of Y given X
With a message sent from X to Y, why is the noise entropy H(Y|X)?
H(Y|X) is the uncertainty that remains in Y when X is held constant (i.e., when no new information is being sent through the channel). Any variation in Y while X is fixed must come from the channel itself, so H(Y|X) measures the amount of information in the NOISE transmitted in the channel.
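As a sketch, H(Y|X) can be computed directly from a joint distribution p(x, y) by averaging the uncertainty in Y over each fixed value of X (the joint table below is hypothetical, representing a noisy binary channel):

```python
import math

# Hypothetical joint distribution p(x, y): rows index x, columns index y.
p_xy = [[0.45, 0.05],
        [0.05, 0.45]]

# H(Y|X) = sum over x of p(x) * H(Y | X = x):
# hold X constant, measure the leftover uncertainty in Y, then average.
h_y_given_x = 0.0
for row in p_xy:
    p_x = sum(row)                   # marginal p(x)
    for p in row:
        if p > 0:
            p_cond = p / p_x         # conditional p(y | x)
            h_y_given_x -= p * math.log2(p_cond)
print(h_y_given_x)                   # noise entropy in bits
```

Here each row gives conditionals of 0.9 and 0.1, so the noise entropy comes out to about 0.47 bits.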