After showing that the standard Gibbs crossentropy was flawed, and after trying to fix it with an equally flawed initial formulation of the "free of logs" entropy, we faced the problem of finding a way to replace a sum with a product without breaking anything important. Here we go...
When you define an entropy as a sum, each of its terms is supposed to be "a little above zero": small and positive, ε ≥ 0, so that adding it can only slightly increase the entropy. Also, when you add a new term for a probability with p = 0 or p = 1, you need this new term to be 0 so it doesn't change the resulting entropy at all.
Conversely, when you want to define an entropy as a product of terms, they need to be "a little above one", of the form (1 + ε), and the terms associated with the extreme probabilities p = 0 and p = 1 cannot change the resulting entropy, so they need to be exactly 1.
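To make the contrast concrete, here is a minimal Python sketch (function names are mine, not from the original formulation) checking the neutral-element behaviour of both kinds of terms:

```python
import math

# Additive (Gibbs-style) entropy term: small and positive,
# and exactly 0 at the extremes p = 0 and p = 1.
def additive_term(p):
    return -p * math.log(p) if p > 0 else 0.0  # convention: 0 * log(0) = 0

# Multiplicative term: "a little above 1",
# and exactly 1 at the extremes p = 0 and p = 1.
def multiplicative_term(p):
    return 2 - p**p if p > 0 else 1.0  # convention: 0^0 = 1

for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(p, additive_term(p), multiplicative_term(p))
```

Adding `additive_term(0)` or `additive_term(1)` to a sum changes nothing, and multiplying by `multiplicative_term(0)` or `multiplicative_term(1)` changes nothing either, which is exactly the symmetry being asked for.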
In the previous entropy this ε term was defined as (1 - p_{i}^{p_i}), and now we need something of the form (1 + ε), so why not just try (2 - p_{i}^{p_i})?
Let us be naive again and propose the following formulas for the entropy and crossentropy:
H_{2}(P) = ∏(2 - p_{i}^{p_i})
H_{2}(Q‖P) = ∏(2 - q_{i}^{p_i})
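The two formulas above can be sketched in a few lines of Python (the helper name `h2` is mine). Note that Python's float power already gives `0.0 ** 0.0 == 1.0`, matching the 0^0 = 1 convention used here:

```python
from math import prod  # Python 3.8+

def h2(p, q=None):
    """Multiplicative entropy H2(P) = prod(2 - p_i^p_i).
    If q is given, the crossentropy H2(Q||P) = prod(2 - q_i^p_i)."""
    q = p if q is None else q
    return prod(2 - qi**pi for pi, qi in zip(p, q))

P = [0.5, 0.5]
Q = [0.9, 0.1]
print(h2(P))     # (2 - 0.5^0.5)^2 = 4.5 - 2*sqrt(2) ~ 1.6716
print(h2(P, Q))  # finite even though Q is far from uniform
```

A degenerate distribution like [0, 1] gives H_{2} = 1, the neutral element of the product, just as a degenerate distribution gives 0 in an additive entropy.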
Once again it looks too easy to be worth researching, but once again I did, and it proved (well, my friend José María Amigó actually did the proving) to be a perfectly defined generalised entropy of a really weird class, with Hanel-Thurner exponents (0, 0), something never seen in the literature before.
As you can see, this new crossentropy formula is perfectly well defined for any combination of p_{i} and q_{i} (in this context we are assuming 0^{0} = 1) and, if you graphically compare both crossentropy terms, you find that, for the Gibbs version, the term is unbounded (as q approaches 0 its value goes up to infinity):

ε_{G}(p, q) = -(p × log(q))
In the new multiplicative form of entropy,
this term is 'smoothed out' and nicely bounded between 1 and 2:

ε_{2}(p, q) = (2 - q^{p})
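The contrast in boundedness is easy to verify numerically; a quick sketch comparing both terms as q shrinks toward 0 (with a fixed p):

```python
import math

def gibbs_term(p, q):
    # -p * log(q): grows without bound as q -> 0
    return -p * math.log(q)

def mult_term(p, q):
    # 2 - q^p: always stays within [1, 2] for p, q in [0, 1]
    return 2 - q**p

p = 0.5
for q in (0.5, 1e-3, 1e-9):
    print(q, gibbs_term(p, q), mult_term(p, q))
```

Running this shows the Gibbs term climbing steadily as q shrinks, while the multiplicative term just creeps toward its ceiling of 2.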