RE: LeoThread 2025-02-16 20:43

in LeoFinance · 8 months ago

Part 4/8:

Shannon proposed that the information content of an event, denoted H, is the logarithm of the inverse of the event's probability, expressed mathematically as H = log(1/P), where P is the probability of the event occurring. Base-2 logarithms yield a measurement in bits and are the usual choice for discrete variables; natural logarithms yield a measurement in nats and are common when working with continuous variables.
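As a minimal sketch of this formula (the function name and interface here are my own, not from the original article), the information content can be computed directly from a probability:

```python
import math

def information_content(p: float, base: float = 2.0) -> float:
    """Information content H = log(1/p) of an event with probability p.

    base=2 measures H in bits; base=math.e measures H in nats.
    """
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return math.log(1 / p, base)

# A fair coin flip (p = 0.5) carries exactly 1 bit of information.
print(information_content(0.5))          # 1.0

# A certain event (p = 1) carries no information.
print(information_content(1.0))          # 0.0

# The same coin flip measured in nats instead of bits.
print(information_content(0.5, math.e))  # ~0.693
```

Note how the rarer the event (smaller p), the larger H becomes, matching the "surprise" interpretation described below.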

Defining Information Content

Shannon's characterization of information content can also be conceptualized in terms of surprise. The less likely an event is to occur, the more surprising it is, thereby increasing its information content. Several criteria define this metric:

  1. Deterministic Outcomes: Events that are certain convey no information (H = 0).