Communication Chapter 3 Flashcards
(35 cards)
Symmetric channel: p₁₀ = p₀₁ =
p₁₀ = p₀₁ = p
Symmetric channel: p₀₀ = p₁₁ =
p₀₀ = p₁₁ = 1-p
noisy channel probabilities satisfy
p₀₀ + p₀₁ = 1 = p₁₀ + p₁₁
crossover probability
The probability p that a 0 is flipped to a 1, which equals the probability that a 1 is flipped to a 0.
if p = 0 then
the channel is noiseless
if p = 1/2 then
the channel is useless (it is too noisy to decipher any of the messages)
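Not from the cards, but a minimal Python sketch of a binary symmetric channel may help make the role of p concrete; the function name bsc is made up for the example.

```python
import random

def bsc(bits, p):
    """Flip each bit independently with crossover probability p."""
    return [b ^ 1 if random.random() < p else b for b in bits]

# p = 0 returns the input unchanged (noiseless);
# p = 1/2 makes the output independent of the input (useless).
print(bsc([0, 1, 1, 0, 1], 0.1))
```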
What is the probability that precisely i errors occur with crossover probability p
(n choose i) pⁱ (1-p)ⁿ⁻ⁱ
i.e. the binomial distribution
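A quick illustrative sketch of this formula in Python (the function name is made up):

```python
from math import comb

def prob_exactly_i_errors(n, i, p):
    """Binomial probability of exactly i flips in n channel uses."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

print(prob_exactly_i_errors(7, 1, 0.1))  # exactly one error in 7 bits
```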
Channel Encoding map
f:A->{0,1}ⁿ
Channel decoding maps
g: {0,1}ⁿ->A such that g(f(x))=x for every x∈A.
h: {0,1}ⁿ->C such that h(y)=y for all y∈C.
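An illustrative sketch of an encoding/decoding pair as Python dicts; the alphabet and codewords are made up for the example:

```python
# f: A -> {0,1}^n as a lookup table, here with n = 3
f = {"a": "000", "b": "011", "c": "101", "d": "110"}
# g: {0,1}^n -> A, inverting f on the codewords
g = {codeword: symbol for symbol, codeword in f.items()}

assert all(g[f[x]] == x for x in f)  # g(f(x)) = x for every x in A
```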
length of a code
we say a code C has length n if all its codewords have length n
If a code has length n then the code is automatically…
it is automatically prefix-free (no codeword can be a proper prefix of another, since all codewords have the same length)
A
the alphabet we are encoding from (or decoding to)
probability of wrong decoding (in words)
Pₑᵣᵣ = the maximum, over all codewords c∈C, of the sum over all strings y∈{0,1}ⁿ that decode to a codeword other than c of the probability that y is received given c is sent
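An illustrative brute-force sketch of this quantity for a small code over a BSC, assuming minimum distance decoding (defined in a later card); the code C here is made up:

```python
from itertools import product

def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))

def p_err(C, n, p):
    """Worst-case (over codewords) probability of wrong decoding."""
    worst = 0.0
    for c in C:
        total = 0.0
        for bits in product("01", repeat=n):
            y = "".join(bits)
            decoded = min(C, key=lambda cw: hamming(cw, y))
            if decoded != c:  # y would be wrongly decoded
                d = hamming(c, y)
                total += p**d * (1 - p)**(n - d)  # P(y received | c sent)
        worst = max(worst, total)
    return worst

print(p_err(["000", "111"], 3, 0.1))  # 0.028
```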
Transmission rate formula
R(C) = log₂|C| / n
The higher the transmission rate…
the more efficiently the codewords are sent
Transmission rate is always less than or equal to
1
|A|=
|C|
How to calculate the smallest code length you can achieve for a code C of size |C|
⌈log₂|C|⌉
i.e. there are 2ˣ binary strings of length x, so we need the smallest n with 2ⁿ ≥ |C|
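A tiny sketch of both calculations (the helper name is made up):

```python
from math import ceil, log2

def min_code_length(code_size):
    """Smallest n with 2^n >= |C|."""
    return ceil(log2(code_size))

print(min_code_length(5))  # 3, since 2^2 = 4 < 5 <= 8 = 2^3
print(log2(4) / 6)         # transmission rate R(C) for |C| = 4, n = 6
```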
if C is an [n,k] code then R(C)=
R(C) = k/n (since |C| = 2ᵏ, so log₂|C| = k)
How to check a decoding is valid
check g(f(x))=x for every x∈A
Hamming Distance
The Hamming distance, denoted dₕ(x,y), between x = x₁,..,xₙ ∈ {0,1}ⁿ and y = y₁,..,yₙ ∈ {0,1}ⁿ is the number of places where the two strings differ
dₕ(x,y) = |{i∈[n]: xᵢ≠yᵢ}|
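An illustrative Python version of this definition (the function name is made up):

```python
def hamming_distance(x, y):
    """Number of positions where equal-length strings x and y differ."""
    assert len(x) == len(y)
    return sum(xi != yi for xi, yi in zip(x, y))

print(hamming_distance("10110", "11010"))  # 2
```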
metric axioms
- Positivity: dₕ(x,y) ≥ 0, with equality iff x = y
- Symmetry: dₕ(x,y) = dₕ(y,x)
- Triangle inequality: dₕ(x,z) ≤ dₕ(x,y) + dₕ(y,z)
The bigger the minimum distance…
The further apart codewords are from each other, which makes decoding errors less likely
minimum distance decoding informal
A decoding h is a minimum distance decoding if, for every string x∈{0,1}ⁿ, h(x) is a codeword closest to x.
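An illustrative sketch of a minimum distance decoder (ties broken arbitrarily; the code C is made up):

```python
def hamming_distance(x, y):
    return sum(xi != yi for xi, yi in zip(x, y))

def h(x, C):
    """Return a codeword of C closest to x in Hamming distance."""
    return min(C, key=lambda c: hamming_distance(c, x))

C = ["000", "111"]
print(h("010", C))  # "000" -- one flip away
```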