Wednesday 18 July 2018

Graph entropy 6: Separability

This is the 6th post in a series, so you should have read the previous posts (and in the correct order) first. If you just landed here, please start your reading here and follow the links to the next post until you arrive here again, this time with the appropriate background.

In the standard Gibbs-Shannon entropy, the 4th Shannon-Khinchin axiom about separability says two different things (which we will label here as sub-axioms 4.1 and 4.2 respectively). Given two independent distributions P and Q, the entropy of the combined distribution PxQ satisfies:

Axiom 4.1) H(PxQ) = H(P) + H(Q)

When P and Q are not independent, this formula becomes an inequality:
Axiom 4.2) H(PxQ) ≤ H(P) + H(Q)
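
To make the two sub-axioms concrete, here is a minimal numerical sketch in Python (my own illustration, not part of the axioms): the outer product of two distributions plays the role of the independent combination PxQ, and an arbitrary normalized matrix plays the role of a dependent joint with its two marginals.

```python
import numpy as np

def shannon_entropy(p):
    """Gibbs-Shannon entropy H(p) = -sum(p_i * log p_i), in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                          # convention: 0 * log 0 = 0
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(0)
P = rng.random(5); P /= P.sum()           # random distribution, 5 outcomes
Q = rng.random(7); Q /= Q.sum()           # random distribution, 7 outcomes

# Sub-axiom 4.1: for independent P and Q the joint is the outer product,
# and the entropy is exactly additive.
H_joint = shannon_entropy(np.outer(P, Q).ravel())
print(np.isclose(H_joint, shannon_entropy(P) + shannon_entropy(Q)))  # True

# Sub-axiom 4.2: for an arbitrary (dependent) joint distribution J, the
# joint entropy never exceeds the sum of the marginal entropies.
J = rng.random((5, 7)); J /= J.sum()      # random joint distribution
lhs = shannon_entropy(J.ravel())
rhs = shannon_entropy(J.sum(axis=1)) + shannon_entropy(J.sum(axis=0))
print(lhs <= rhs + 1e-12)                 # True (subadditivity)
```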

Graph entropy, being applied to graphs instead of distributions, allows for some more ways of combining two distributions, giving not one but at least three interesting inequalities:

[Image: the three conjectured graph-entropy inequalities]

I have not even tried to prove any of them, and it doesn't look easy to me, but I have run several million tests with pairs of random distributions and no counterexample has been found so far, so for now it is just a personal conjecture that the three inequalities can be proven right.
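
For the curious, this is roughly what such a brute-force search looks like; a minimal sketch, using plain Shannon subadditivity (axiom 4.2) as a stand-in predicate, since the three graph-entropy inequalities live in the figure above. To reproduce the real tests you would swap in the graph-entropy expressions from the previous posts.

```python
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def search_counterexample(inequality, n_trials=100_000, max_size=50, seed=1):
    """Try to falsify `inequality` on random joint distributions.

    Returns the first joint distribution that violates it, or None if no
    counterexample shows up after n_trials attempts.
    """
    rng = np.random.default_rng(seed)
    for _ in range(n_trials):
        rows, cols = rng.integers(2, max_size, size=2)
        J = rng.random((rows, cols))
        J /= J.sum()                      # random joint distribution
        if not inequality(J):
            return J                      # counterexample found
    return None

def subadditivity(J):
    """Stand-in predicate: H(joint) <= H(P) + H(Q) for the marginals P, Q."""
    return shannon_entropy(J.ravel()) <= (shannon_entropy(J.sum(axis=1))
                                          + shannon_entropy(J.sum(axis=0))
                                          + 1e-9)

print(search_counterexample(subadditivity, n_trials=10_000))  # None expected
```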

If you calculate the ratios between the 1st and 2nd figures (R1), between the 2nd and 3rd (R2), and finally between the 3rd and 4th (R3), then average some thousands of them for random distributions of 2, 3, 4, 5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100 and 200 items and plot them, this is the interesting result:

[Image: average ratios R1, R2 and R3 versus distribution size]

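In case you want to reproduce the experiment, here is a sketch of the averaging and plotting scaffold. The function `chain_quantities` is a hypothetical placeholder name: its body should compute the four figures of the inequality chain using the graph-entropy definitions from the previous posts, which I am not repeating here.

```python
import numpy as np
import matplotlib.pyplot as plt

def chain_quantities(P, Q):
    """Hypothetical placeholder: should return the four figures of the
    inequality chain for the pair (P, Q), using the graph-entropy
    expressions defined in the previous posts."""
    raise NotImplementedError

sizes = [2, 3, 4, 5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200]
rng = np.random.default_rng(2)
mean_ratios = []

for n in sizes:
    ratios = []
    for _ in range(2_000):                          # "some thousands" per size
        P = rng.random(n); P /= P.sum()
        Q = rng.random(n); Q /= Q.sum()
        f1, f2, f3, f4 = chain_quantities(P, Q)
        ratios.append((f1 / f2, f2 / f3, f3 / f4))  # R1, R2, R3
    mean_ratios.append(np.mean(ratios, axis=0))

for i, label in enumerate(("R1", "R2", "R3")):
    plt.plot(sizes, [m[i] for m in mean_ratios], label=label)
plt.xlabel("number of items"); plt.ylabel("average ratio"); plt.legend()
plt.show()
```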
Well, that is all I have about graph entropies for now; in future posts I may talk about some practical usage of the new entropy... or not!

Note: graph entropy, being constrained by those nice inequalities, doesn't actually satisfy either of the two sub-axioms. Is there any generalized entropy meeting at least 4.1? Yes! There is one, and only one: the Tsallis generalized entropy!
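
For reference, a minimal sketch of the Tsallis entropy in its standard formulation; for independent distributions it composes through the well-known pseudo-additivity rule, which collapses to the plain additivity of axiom 4.1 in the limit q → 1, where it reduces to the Gibbs-Shannon entropy.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis generalized entropy S_q(p) = (1 - sum p_i^q) / (q - 1).
    In the limit q -> 1 it reduces to the Gibbs-Shannon entropy (in nats)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return float(-np.sum(p * np.log(p)))   # Shannon limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

rng = np.random.default_rng(3)
P = rng.random(6); P /= P.sum()
Q = rng.random(4); Q /= Q.sum()
q = 1.5

# Standard composition rule for independent P and Q (pseudo-additivity):
#   S_q(PxQ) = S_q(P) + S_q(Q) + (1 - q) * S_q(P) * S_q(Q)
# which becomes the plain additivity of axiom 4.1 as q -> 1.
lhs = tsallis_entropy(np.outer(P, Q).ravel(), q)
S_P, S_Q = tsallis_entropy(P, q), tsallis_entropy(Q, q)
print(np.isclose(lhs, S_P + S_Q + (1 - q) * S_P * S_Q))   # True
```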

In the meantime, you can check the slides I have prepared here.
