Part 5/8:
Surprising Events: As the probability of an event decreases, its information content increases.
Additivity of Independent Events: The information content of two independent events should equal the sum of their individual information contents.
Shannon's definition of information content satisfies both of these criteria, making it a principled foundation for analyzing random variables through information theory.
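The two criteria can be verified numerically with a short sketch. This assumes the standard definition of self-information, I(x) = -log2 p(x), which the text does not state explicitly:

```python
import math

def information_content(p: float) -> float:
    """Self-information in bits: I(p) = -log2(p)."""
    return -math.log2(p)

# Surprising events: a rarer event carries more information.
rare = information_content(0.01)    # ~6.64 bits
common = information_content(0.5)   # 1.0 bit
assert rare > common

# Additivity: two independent events with probabilities p and q
# occur jointly with probability p * q, and the information adds up.
p, q = 0.25, 0.5
assert math.isclose(information_content(p * q),
                    information_content(p) + information_content(q))
```

The base-2 logarithm gives information in bits; any other base only changes the unit, not the two properties above.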
Delving Deeper: Exploring Information Entropy
Beyond individual outcomes, Shannon entropy captures the expected information content of a random variable: the average number of bits needed to describe an outcome drawn from it. It thereby measures the uncertainty, or randomness, across a variable's possible outcomes.
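A minimal sketch of this expectation, assuming the standard formula H(X) = -∑ p(x) log2 p(x) with the convention that zero-probability terms contribute nothing:

```python
import math

def entropy(probs) -> float:
    """Shannon entropy in bits: H = sum of -p * log2(p), skipping p = 0."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: exactly 1 bit.
fair = entropy([0.5, 0.5])      # 1.0
# A biased coin is more predictable, so its entropy is lower.
biased = entropy([0.9, 0.1])    # ~0.47
# A certain outcome carries no uncertainty at all.
certain = entropy([1.0])        # 0.0
assert certain < biased < fair
```

Entropy is highest when all outcomes are equally likely and drops to zero when one outcome is certain, matching the intuition that it measures uncertainty rather than any particular outcome's surprise.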