Compression Models Flashcards

1
Q

Entropy compression model (Huffman coding)

A

Generates codewords based on the distribution of inputs.

Optimal in terms of compression rate.
A preliminary pass over the data is required to learn its statistics.

2
Q

Static compression model (Morse, ASCII)

A

The same compression model is used for all input symbols, and the codewords all have the same length regardless of the input symbol.

Fast but not optimal, because symbols occur with different probabilities that a static model ignores.

3
Q

Adaptive compression model (LZW)

A

Progressively learns and updates the model of incoming text.

Highly efficient even though the statistics of the incoming data are not known in advance.

4
Q

Huffman coding

A
  • Sort symbols by probability.
  • Take the two symbols with the smallest probabilities and join them into a new symbol whose probability is their sum.
  • Assign 0 to one of the two symbols and 1 to the other.

Repeat, and terminate when the combined probability reaches 1 (100%); see the sketch below.
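
A minimal Python sketch of this merge procedure (the input format and the example symbols/probabilities are illustrative assumptions, not from the card):

```python
import heapq

def huffman_code(symbol_probs):
    """Build a Huffman code from a dict {symbol: probability}.

    Repeatedly joins the two least-probable nodes, prefixing '0' to the
    codewords on one side and '1' on the other, until a single node
    (combined probability 1) remains.
    """
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(symbol_probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p0, _, codes0 = heapq.heappop(heap)   # smallest probability
        p1, _, codes1 = heapq.heappop(heap)   # second smallest
        merged = {s: "0" + c for s, c in codes0.items()}         # 0 to one branch
        merged.update({s: "1" + c for s, c in codes1.items()})   # 1 to the other
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    return heap[0][2]

# Illustrative example: four symbols with skewed probabilities.
print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# -> {'a': '0', 'b': '10', 'c': '110', 'd': '111'} (one valid Huffman code)
```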

5
Q

Huffman encoding (dynamic)

A

Advantages:
The code is uniquely decodable and prefix-free.
Used in many applications (e.g. JPEG compression).

Limitations:
The probabilities of the symbols must be known in advance.
Error propagation: with variable-length codewords, a single bit error can desynchronize the decoder and corrupt the following symbols.

6
Q

LZW (adaptive)

A

Encoding and decoding complexity: O(N log N).
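
A minimal sketch of the adaptive-dictionary idea behind LZW encoding (simplified: the initial dictionary is built from the characters present in the input rather than a full fixed alphabet):

```python
def lzw_encode(text):
    """LZW encoding: the dictionary is learned adaptively while scanning.

    Starts from single-character entries and adds a new entry
    (current phrase + next character) whenever a phrase is not yet
    in the dictionary.
    """
    # Simplified initial dictionary: only the characters present in the input.
    dictionary = {ch: i for i, ch in enumerate(sorted(set(text)))}
    phrase, output = "", []
    for ch in text:
        if phrase + ch in dictionary:
            phrase += ch                                # extend current match
        else:
            output.append(dictionary[phrase])           # emit longest match
            dictionary[phrase + ch] = len(dictionary)   # learn new phrase
            phrase = ch
    if phrase:
        output.append(dictionary[phrase])
    return output

print(lzw_encode("abababab"))  # -> [0, 1, 2, 4, 1] with a->0, b->1
```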

7
Q

Why not use entropy for compression of symbols with similar probability?

A

Because the associated codewords would all have (roughly) the same length, so the compression gain would be low.
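
An illustrative case: for four equiprobable symbols, an entropy coder gains nothing over a plain fixed-length code:

```latex
H = -\sum_{i=1}^{4} \tfrac{1}{4}\log_2\tfrac{1}{4} = 2 \ \text{bits/symbol}
% A Huffman code here is e.g. 00, 01, 10, 11 -- the same 2 bits per symbol
% as a fixed-length code, so entropy coding brings no gain over it.
```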

8
Q

Entropy

A

The average number of bits per symbol of any uniquely decodable code must be greater than or equal to the entropy of the source.
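
In symbols (the standard Shannon source-coding bound), for symbol probabilities p_i and codeword lengths l_i:

```latex
H(X) = -\sum_i p_i \log_2 p_i
\qquad\text{and}\qquad
\bar{L} = \sum_i p_i\,\ell_i \;\ge\; H(X)
```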

9
Q

Compression rate

A

Compression rate = (length of the uncompressed data) / (length of the compressed data). Compare the achieved average codeword length with the entropy to judge how close the code is to optimal.
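
A made-up numeric illustration: text stored as 8-bit ASCII characters, re-encoded with a code whose average length equals a source entropy of 1.75 bits/character:

```latex
\text{compression rate}
= \frac{\text{len}_{\text{uncompressed}}}{\text{len}_{\text{compressed}}}
= \frac{8\ \text{bits/char}}{1.75\ \text{bits/char}} \approx 4.6
```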
