Wednesday, June 19, 2024

Entropy

If p is a discrete probability measure, then the Shannon entropy of p is H(p) = −∑_x p({x}) log p({x}). I had never had any intuitive feeling for Shannon entropy until I noticed the well-known fact that H(p) is the expected value of the logarithmic inaccuracy score of p by the lights of p, i.e., the p-expectation of −log p({x}). Since I've spent a long time thinking about inaccuracy scores, I now get some intuitions about entropy for free.
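
A minimal numerical sketch of that identity (natural logarithm, dictionary-valued distributions; the names and example distribution are mine, just for illustration):

```python
import math

def log_score(p, x):
    """Logarithmic inaccuracy score of the distribution p at outcome x: -log p({x})."""
    return -math.log(p[x])

def shannon_entropy(p):
    """H(p) = -sum_x p({x}) log p({x})."""
    return -sum(px * math.log(px) for px in p.values() if px > 0)

def expected_score(p, score):
    """Expected inaccuracy of p by the lights of p: E_p[score(p, x)]."""
    return sum(px * score(p, x) for x, px in p.items() if px > 0)

p = {"a": 0.5, "b": 0.25, "c": 0.25}
print(shannon_entropy(p))             # about 1.0397
print(expected_score(p, log_score))   # same value
```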

Entropy is a measure of the randomness of p. But now I am thinking that there are other measures: for any strictly proper inaccuracy scoring rule s, we can take E_p s(p), the expected inaccuracy of p by its own lights, to be some sort of measure of the randomness of p. These measures won't have the nice connections with information theory, though.
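
For example, if s is the Brier score, then E_p s(p) works out to 1 − ∑_x p({x})², the Gini–Simpson index. A sketch along these lines (the Brier score is just one illustrative choice of strictly proper rule; distribution and names mine):

```python
def brier_score(p, x):
    """Brier inaccuracy of p at outcome x: sum_y (p({y}) - [y == x])^2."""
    return sum((py - (1.0 if y == x else 0.0)) ** 2 for y, py in p.items())

def generalized_entropy(p, score):
    """E_p s(p): expected inaccuracy of p by its own lights."""
    return sum(px * score(p, x) for x, px in p.items())

p = {"a": 0.5, "b": 0.25, "c": 0.25}
print(generalized_entropy(p, brier_score))     # 0.625
print(1 - sum(px ** 2 for px in p.values()))   # 0.625, the Gini-Simpson index
```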

2 comments:

  1. The key property of the standard definition is that the entropy is additive over independent subsystems. I doubt that this will apply to the other approaches.

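
For what it's worth, a quick numerical check bears this out for the Brier-based measure above (example distributions mine): Shannon entropy is additive over an independent product, while the Brier-based analogue is not.

```python
import math

def shannon_entropy(p):
    """H(p) = -sum_x p({x}) log p({x})."""
    return -sum(px * math.log(px) for px in p.values() if px > 0)

def gini_simpson(p):
    """Expected Brier inaccuracy of p by its own lights: 1 - sum_x p({x})^2."""
    return 1 - sum(px ** 2 for px in p.values())

def product(p, q):
    """Joint distribution of two independent distributions p and q."""
    return {(x, y): px * qy for x, px in p.items() for y, qy in q.items()}

p = {"a": 0.5, "b": 0.5}
q = {"c": 0.9, "d": 0.1}
pq = product(p, q)

# Shannon entropy is additive over the independent product:
print(shannon_entropy(pq), shannon_entropy(p) + shannon_entropy(q))  # both about 1.018

# The Brier-based analogue is not:
print(gini_simpson(pq), gini_simpson(p) + gini_simpson(q))  # 0.59 vs 0.68
```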