I had two options: not mention it at all, or try to fix it. I opted for the first, as I had no idea how to fix it, but I felt I was sweeping a big issue with the theory under the carpet, so one random day I tried to find a fix.
Ideally, I thought, I would only need to replace the (pᵢ × log(pᵢ)) part with something like (pᵢ^pᵢ), but it was such a naive idea that I almost gave up before even having a look. Still, I did look: how different do those two functions look when plotted on their domain interval (0, 1)?
Wow! They were just mirror images of each other! In fact, you only need a small change to match them: (1 - pᵢ^pᵢ).
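If you want to reproduce the comparison, here is a minimal sketch (my own code, not from the original post) that samples -pᵢ × log(pᵢ) and 1 - pᵢ^pᵢ side by side on (0, 1):

```python
import numpy as np

# Sample the open interval (0, 1), where both expressions are defined.
p = np.linspace(0.05, 0.95, 10)

shannon_term = -p * np.log(p)   # -p*log(p): the usual entropy summand
candidate    = 1 - p**p         # 1 - p^p: the proposed stand-in

for pi, a, b in zip(p, shannon_term, candidate):
    print(f"p={pi:.2f}   -p*log(p)={a:.4f}   1-p^p={b:.4f}   diff={a - b:+.4f}")
```

The two columns stay within roughly 0.06 of each other everywhere on the interval, which is the "mirror image" effect in numbers.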
So my first attempt to build a new entropy was this:
H₁(P) = k·𝚺(1 - pᵢ^pᵢ)
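As a quick sanity check, here is a small sketch of H₁ next to the ordinary Shannon entropy (my own code; k = 1 is assumed and 0·log(0) is taken as 0):

```python
import numpy as np

def h1(p, k=1.0):
    """First attempt: H1(P) = k * sum_i (1 - p_i**p_i)."""
    p = np.asarray(p, dtype=float)
    return k * np.sum(1 - p**p)

def shannon(p, k=1.0):
    """Standard form: H(P) = -k * sum_i p_i * log(p_i)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]                       # drop zeros: 0*log(0) -> 0
    return -k * np.sum(nz * np.log(nz))

uniform = [0.25, 0.25, 0.25, 0.25]
skewed  = [0.70, 0.10, 0.10, 0.10]
print(h1(uniform), shannon(uniform))    # ~1.17 vs ~1.39: both maximal for the uniform case
print(h1(skewed),  shannon(skewed))     # ~0.84 vs ~0.94: both drop as probability concentrates
```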
Surprisingly, it proved to be a real generalised entropy with quite standard Hanel-Thurner exponents of (1, 1), but it was not nicely separable: there was nothing close to a replacement for the 4th axiom, so basically it was an interesting piece of bullshit I wasted some weeks working on.
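For the curious, the (1, 1) claim can be checked directly. Assuming the Hanel-Thurner scaling laws for a trace-form entropy 𝚺 g(pᵢ) in their usual form, lim g(zx)/g(x) = z^c and lim g(x^(1+a))/(x^(ac)·g(x)) = (1+a)^d as x → 0⁺, and using g(x) = 1 - x^x, a short derivation (mine, not from the post) gives:

```latex
% Near zero, x^x = e^{x\ln x} \approx 1 + x\ln x, so g(x) = 1 - x^x \approx -x\ln x.
\[
  \lim_{x\to 0^+} \frac{g(zx)}{g(x)}
    = \lim_{x\to 0^+} \frac{-zx\,(\ln z + \ln x)}{-x\ln x}
    = z
  \quad\Rightarrow\quad c = 1,
\]
\[
  \lim_{x\to 0^+} \frac{g\!\left(x^{1+a}\right)}{x^{ac}\,g(x)}
    = \lim_{x\to 0^+} \frac{-(1+a)\,x^{1+a}\ln x}{-x^{1+a}\ln x}
    = 1 + a = (1+a)^d
  \quad\Rightarrow\quad d = 1,
\]
% the same (c, d) = (1, 1) class as the Boltzmann-Gibbs-Shannon entropy.
```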
After trying to square the circle for some time, I realised I had only reverted part of the logarithmic problem: the summation sign 𝚺 should also have been replaced with the original multiplication one, ∏. The entropy I was looking for was a product of terms, not a sum as usually expected.
But how can you build an entropy out of multiplications and still have it make sense? To my knowledge, the first three axioms of entropy had never been applied to a multiplicative form, so it was an uncharted landscape to explore or, who knows, just a total waste of time.
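To give a taste of what a multiplicative form can look like (this is my own illustration, not necessarily the construction the next post uses), the product ∏ pᵢ^(-pᵢ) turns the usual sum of -pᵢ × log(pᵢ) terms into a product of 1/pᵢ^pᵢ factors, and is exactly exp of the Shannon entropy:

```python
import numpy as np

def multiplicative_entropy(p):
    """Product form: prod_i p_i**(-p_i), i.e. exp of the Shannon entropy (in nats)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]                 # convention: a factor for p_i = 0 is 0**0 = 1
    return np.prod(nz ** -nz)

def shannon(p):
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log(nz))

dist = [0.5, 0.25, 0.25]
print(multiplicative_entropy(dist))   # 2.828... (= 2**1.5)
print(np.exp(shannon(dist)))          # identical: the product is just exp of the sum
```

Of course, taking the log of this particular product gives back the ordinary additive entropy; it is only meant to show that a product of per-symbol factors can carry the same information as the familiar sum.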
Interesting, looking forward to the next post. Could be a more natural way of interpreting entropy.
The next post is out!
https://entropicai.blogspot.com/2018/06/graph-entropy-3-changing-rules.html