Entropy Flashcards
(5 cards)
What is entropy?
Layman: measure of disorder in a closed system
Information theory: a measure of the lack of structure or detail in the probability distribution describing your knowledge
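The information-theoretic definition can be made concrete with Shannon's formula, H = -Σ p·log2(p). A minimal Python sketch (function name is my own choice):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A certain outcome carries no uncertainty: 0 bits.
print(shannon_entropy([1.0]))       # 0.0
```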
What is maximum entropy in practice?
Finding the flattest possible probability distribution that is compatible with the constraints that arise from prior knowledge.
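"Flattest" can be checked numerically: with no constraint beyond normalization, the uniform distribution has the highest entropy. A small sketch:

```python
import math

def entropy(p):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Over 4 outcomes, the flat (uniform) distribution maximizes entropy.
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
print(entropy(uniform))  # 2.0 bits, the maximum for 4 outcomes
print(entropy(skewed))   # lower, because it has more structure
```

With additional constraints (e.g., a fixed mean), the maximum-entropy solution is the flattest distribution that still satisfies them.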
If given a distribution with lower entropy versus one with higher, what is the justification for choosing the one with higher entropy?
If both distributions are consistent with your prior knowledge, prefer the one with higher information entropy, because it makes fewer implicit, unwarranted assumptions.
Physical systems also tend to evolve toward maximum entropy over time (the second law of thermodynamics).
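The preference rule can be illustrated with two distributions that satisfy the same prior constraint (here, an assumed known mean of 2 over the values 1, 2, 3):

```python
import math

def entropy(p):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mean(values, p):
    """Expected value of `values` under distribution `p`."""
    return sum(v * q for v, q in zip(values, p))

values = [1, 2, 3]
a = [1/3, 1/3, 1/3]   # flat
b = [0.5, 0.0, 0.5]   # assumes the middle value never occurs
# Both match the prior knowledge: mean = 2.
print(mean(values, a), mean(values, b))
# Prefer a: it has higher entropy, so fewer hidden assumptions.
print(entropy(a) > entropy(b))  # True
```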
How is information measured in bits?
A bit is one binary choice taking you from one state to another; the more binary choices needed to get from one state to the other, the more bits are required.
It takes more binary choices to get from complete uncertainty to a known state, and fewer to get from a partially known state to a known state.
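The binary-choice picture is just log2: identifying one item among N equally likely options takes log2(N) yes/no questions. A quick sketch:

```python
import math

# Full uncertainty over 8 options: 3 yes/no questions (bits).
print(math.log2(8))  # 3.0
# Partial knowledge narrowing it to 2 options: only 1 question left.
print(math.log2(2))  # 1.0
```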
What does the structure of a distribution with high entropy versus one with low entropy look like?
A high-entropy distribution is flatter, with less structure; a low-entropy distribution is more peaked and structured.