In information theory, the Rényi entropy is a measure (or family of measures) of the “surprise” or “information” contained in a random variable $X$. It is defined as follows:

$$H_\alpha(X) = \frac{1}{1-\alpha} \log\left(\sum_{i=1}^{n} p_i^{\alpha}\right)$$
where $\alpha \geq 0$, $\alpha \neq 1$, is a free parameter and $p_i$ denotes the probability of the $i$-th event. The logarithm is usually base-2, but variations exist.
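To make the definition concrete, here is a minimal Python sketch that evaluates it directly; the function name `renyi_entropy` and the three-outcome example distribution are purely illustrative choices, not anything fixed by the text above:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy (base-2, in bits) of a discrete distribution p, for alpha >= 0 and alpha != 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # only outcomes with nonzero probability contribute
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

# Arbitrary example distribution over three events
p_example = [0.5, 0.3, 0.2]
print(renyi_entropy(p_example, alpha=0))  # Hartley entropy: log2(3) ≈ 1.585 bits
print(renyi_entropy(p_example, alpha=2))  # collision entropy ≈ 1.396 bits
```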
The case $\alpha = 0$ is known as the Hartley entropy or max-entropy, and quantifies the “surprise” of an event from $X$ if $X$ is uniformly distributed:

$$H_0(X) = \log n$$
where $n$ is the cardinality of $X$: the number of different possible events. The most famous case, however, is $\alpha = 1$. Since the prefactor $\frac{1}{1-\alpha}$ is problematic for $\alpha = 1$, we must take the limit:

$$H_1(X) = \lim_{\alpha \to 1} \frac{1}{1-\alpha} \log\left(\sum_{i=1}^{n} p_i^{\alpha}\right)$$
We then apply L’Hôpital’s rule to evaluate this limit, and use the fact that all $p_i$ sum to $1$:

$$H_1(X) = \lim_{\alpha \to 1} \frac{\frac{\mathrm{d}}{\mathrm{d}\alpha} \log\left(\sum_i p_i^{\alpha}\right)}{\frac{\mathrm{d}}{\mathrm{d}\alpha}\,(1-\alpha)} = \lim_{\alpha \to 1} \frac{\sum_i p_i^{\alpha} \log p_i}{-\sum_i p_i^{\alpha}} = -\sum_{i=1}^{n} p_i \log p_i$$
This quantity is the Shannon entropy, which is the most widely used measure of “surprise”:

$$H_1(X) = -\sum_{i=1}^{n} p_i \log p_i$$
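The limit can also be checked numerically: as $\alpha$ approaches $1$ from either side, the Rényi entropy closes in on the Shannon value. A small self-contained sketch, using the same arbitrary example distribution as before:

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])
shannon = -np.sum(p * np.log2(p))                         # ≈ 1.485 bits

for alpha in (0.9, 0.99, 0.999, 1.001, 1.01, 1.1):
    h_alpha = np.log2(np.sum(p ** alpha)) / (1 - alpha)   # Rényi entropy for this alpha
    print(alpha, h_alpha)                                  # values close in on the Shannon entropy
print("H_1 =", shannon)
```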
Next, for $\alpha = 2$, we get the collision entropy, which describes the surprise of two independent and identically distributed variables $X$ and $X'$ yielding the same event:

$$H_2(X) = -\log \sum_{i=1}^{n} p_i^2 = -\log P(X = X')$$
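The name can be verified directly: the argument of the logarithm is exactly the probability that two independent draws from the same distribution collide. A small sketch with the same arbitrary example distribution (the sampling part is just an empirical sanity check):

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])
p_collision = np.sum(p ** 2)      # P(X = X') = 0.38
print(-np.log2(p_collision))      # H_2 ≈ 1.396 bits

# Empirical check: draw two independent samples many times and count collisions
rng = np.random.default_rng(0)
x1 = rng.choice(len(p), size=100_000, p=p)
x2 = rng.choice(len(p), size=100_000, p=p)
print(np.mean(x1 == x2))          # close to 0.38
```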
Finally, in the limit $\alpha \to \infty$, the largest probability dominates the sum, leading to the definition of the min-entropy $H_\infty$, describing the surprise of the most likely event:

$$H_\infty(X) = -\log \max_i p_i$$
It is straightforward to convince yourself that these entropies are ordered in the following way:

$$H_0(X) \geq H_1(X) \geq H_2(X) \geq H_\infty(X)$$
In other words, from left to right, they go from permissive to conservative, roughly speaking.
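One quick way to see the ordering in practice is to evaluate all four entropies on the same distribution; a minimal sketch, again with the arbitrary three-outcome example used above:

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])

H0   = np.log2(np.count_nonzero(p))   # Hartley / max-entropy ≈ 1.585 bits
H1   = -np.sum(p * np.log2(p))        # Shannon entropy       ≈ 1.485 bits
H2   = -np.log2(np.sum(p ** 2))       # collision entropy     ≈ 1.396 bits
Hinf = -np.log2(np.max(p))            # min-entropy           = 1.000 bit

print(H0 >= H1 >= H2 >= Hinf)         # True
```

For this particular distribution the four values come out to roughly 1.585, 1.485, 1.396, and 1.000 bits, consistent with the inequality above.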