Sunday, 10 June 2018

Graph Entropy 2: A first replacement

As I commented in a previous post, I found that there were cases where cross-entropy and KL-divergence were not well defined. Unluckily, in my theory those cases were the norm.

I had two options: not mention it at all, or try to fix it. I opted for the first, as I had no idea how to fix it, but I felt I was sweeping a big issue with the theory under the carpet, so one random day I tried to find a fix.

Ideally, I thought, I would only need to replace the pi × log(pi) part with something like pi^pi. It was such a naive idea that I almost gave up before even looking, but I did look: how different would those two functions be when plotted on their domain interval (0, 1)?

Wow! They were practically mirror images of each other! In fact, you only need a small change to match them: 1 - pi^pi:

The match was even closer after scaling with k = 4/5:
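The shape resemblance is easy to check numerically. Here is my own quick sketch (the function names are mine, not from the post); it tabulates the classic term -p·log(p) next to the candidate 1 - p^p and confirms that both curves peak at the same point, p = 1/e:

```python
import math

def shannon_term(p):
    """The classic per-symbol entropy term -p*log(p)."""
    return -p * math.log(p)

def candidate_term(p, k=1.0):
    """The proposed replacement k*(1 - p**p)."""
    return k * (1.0 - p ** p)

# Tabulate both curves on the open interval (0, 1).
ps = [i / 100 for i in range(1, 100)]
shannon = [shannon_term(p) for p in ps]
candidate = [candidate_term(p) for p in ps]

# -p*log(p) is maximised at p = 1/e, and p**p is minimised
# there, so 1 - p**p is maximised at the very same point.
peak_shannon = ps[shannon.index(max(shannon))]
peak_candidate = ps[candidate.index(max(candidate))]
print(peak_shannon, peak_candidate)  # both ≈ 0.37 ≈ 1/e
```

Both dips bottoming out at exactly 1/e is what makes the "mirror image" impression more than a visual coincidence.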

So my first attempt to build a new entropy was this:

H1(P) = -k·𝚺(1 - pi^pi)

Surprisingly, it proved to be a real generalised entropy, with quite standard Hanel-Thurner exponents of (1, 1), but it was not nicely separable: there was nothing resembling a replacement for the 4th axiom, so basically it was interesting bullshit I wasted some weeks working on.

After trying to square the circle for some time, I realised I had only reverted part of the logarithmic problem: the logarithm is precisely what turns products into sums, so removing it should also turn the summation sign 𝚺 back into the original multiplication one, ∏. The entropy I was looking for was a product of terms, not a sum as usually expected.

But how can you build an entropy out of multiplications and still make sense of it? To my knowledge, the first three axioms of entropy had never been applied to a multiplicative form, so it was an uncharted landscape to explore or, who knows, just a total waste of time.

We will see whether I wasted my time again in the next post. For now, just take my word on this: there was a way!


  1. Interesting, looking forward to the next post. Could be a more natural way of interpreting entropy.

    1. The next post is out!