After showing that the standard Gibbs cross-entropy was flawed, and after trying to fix it with an equally flawed initial formulation of a "free-of-logs" entropy, we faced the problem of finding a way to replace a sum with a product without breaking anything important. Here we go...
When you define an entropy as a sum, each of the terms is supposed to be "a little above zero": small and positive, ε ≥ 0, so that adding it can only slightly increase the entropy. Also, when you add a new term for a probability with p = 0 or p = 1, you need this new term to be 0 so it doesn't change the resulting entropy at all.
Conversely, when you want to define an entropy as a product of terms, they need to be "a little above 1", of the form (1 + ε), and the terms associated with the extreme probabilities p = 0 and p = 1 cannot change the resulting entropy, so they need to be exactly 1.
In the previous entropy this ε term was defined as (1 - p_i^p_i), and now we need something of the form (1 + ε), so why not just try (2 - p_i^p_i)?
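As a quick sanity check (a sketch of mine, not from the original derivation), we can confirm that the proposed term behaves as required at the extreme probabilities, using the 0^0 = 1 convention:

```python
# Sanity check: the multiplicative term (2 - p^p) must be exactly 1
# at the extremes p = 0 and p = 1, and "a little above 1" in between.

def term(p: float) -> float:
    # Python's ** operator already follows the 0**0 == 1 convention
    return 2 - p ** p

print(term(0.0))  # 1.0 (0^0 = 1, so 2 - 1 = 1)
print(term(1.0))  # 1.0 (1^1 = 1, so 2 - 1 = 1)
print(term(0.5))  # ~1.29, slightly above 1
```

So the term really is of the (1 + ε) shape we were looking for.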
Let us be naive again and propose the following formulas for entropy and cross-entropy:
H2(P) = ∏(2 - p_i^p_i)
H2(Q|P) = ∏(2 - q_i^p_i)
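A minimal sketch of both formulas (function names are mine, for illustration only):

```python
from math import prod

def h2(p):
    # H2(P) = product of (2 - p_i^p_i); ** handles the 0^0 = 1 convention
    return prod(2 - pi ** pi for pi in p)

def h2_cross(p, q):
    # H2(Q|P) = product of (2 - q_i^p_i); well defined even when some q_i = 0
    return prod(2 - qi ** pi for pi, qi in zip(p, q))

uniform = [0.5, 0.5]
certain = [1.0, 0.0]
print(h2(uniform))                 # > 1: the uniform case carries entropy
print(h2(certain))                 # 1.0: every term is exactly 1
print(h2_cross(uniform, certain))  # finite even though q contains a zero
```

Note that, unlike the Gibbs cross-entropy, no term can diverge: a zero in Q just contributes a factor of 2.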
Once again it looks too easy to be worth researching, but once again I did, and it proved (well, my friend José María Amigó actually proved it) to be a perfectly defined generalised entropy of a really weird class, with Hanel-Thurner exponents being (0, 0), something never seen in the literature.
As you can see, this new cross-entropy formula is perfectly well defined for any combination of p_i and q_i (in this context, we are assuming 0^0 = 1) and, if you graphically compare both cross-entropy terms, you find that, for the Gibbs version, this term is unbounded (when q = 0 the term value goes up to infinity):
ε_G(p, q) = -(p × log(q))
In the new multiplicative form of entropy,
this term is 'smoothed out' and nicely bounded between 1 and 2:
ε_2(p, q) = (2 - q^p)
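To see the contrast numerically (a small sketch of my own, not from the original post), compare both per-term values as q approaches 0 for a fixed p:

```python
from math import log

def eps_gibbs(p, q):
    # Gibbs cross-entropy term: -(p * log(q)); diverges as q -> 0
    return -(p * log(q))

def eps_2(p, q):
    # Multiplicative term: (2 - q^p); always stays between 1 and 2
    return 2 - q ** p

for q in (0.1, 0.01, 0.001):
    print(eps_gibbs(0.5, q), eps_2(0.5, q))
# The Gibbs term grows without bound while (2 - q^p) stays below 2
```

This is exactly the 'smoothing out' described above: the multiplicative term saturates at 2 instead of diverging.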