In the standard Gibbs-Shannon entropy, the fourth Shannon-Khinchin axiom about separability actually says two different things (which we will label here as sub-axioms 4.1 and 4.2). First, given two independent distributions P and Q, the entropy of the combined distribution P×Q is additive:

Axiom 4.1) H(P×Q) = H(P) + H(Q)

When P and Q are not independent, their combined distribution is the joint distribution (no longer a simple product), and the formula becomes an inequality:

Axiom 4.2) H(P,Q) ≤ H(P) + H(Q)
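To make the two sub-axioms concrete, here is a minimal numerical check (my own Python/NumPy sketch, not from the original post): additivity holds exactly for the outer product of two independent distributions, while only the inequality survives for a correlated joint with the same marginals.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(P) = -sum(p * ln p), ignoring zero entries."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

P = np.array([0.5, 0.3, 0.2])
Q = np.array([0.6, 0.4])

# Axiom 4.1: for independent P and Q the joint is the outer product P x Q,
# and its entropy equals H(P) + H(Q) exactly.
joint_indep = np.outer(P, Q)
print(shannon_entropy(joint_indep.ravel()))      # ~1.703
print(shannon_entropy(P) + shannon_entropy(Q))   # same value

# Axiom 4.2: a correlated joint with the same marginals P and Q
# has strictly smaller entropy than H(P) + H(Q).
joint_corr = np.array([[0.4, 0.1],
                       [0.1, 0.2],
                       [0.1, 0.1]])
assert np.allclose(joint_corr.sum(axis=1), P)
assert np.allclose(joint_corr.sum(axis=0), Q)
print(shannon_entropy(joint_corr.ravel()))       # ~1.609 < 1.703
```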
Graph entropy, being applied to graphs instead of plain distributions, allows for several more ways of combining two distributions, giving not one but at least three interesting inequalities:
If you calculate the ratios between the 1st and 2nd figures (R1), between the 2nd and 3rd (R2), and finally between the 3rd and 4th (R3), then average some thousands of them for random distributions of 2, 3, 4, 5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100 and 200 items and plot them, this is the interesting result:
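As a rough idea of how such an experiment could be set up, here is a hedged sketch of the averaging loop. It is my illustration, not the post's actual code: `figures_for` is a hypothetical placeholder for whatever four entropy quantities the inequalities above relate, left unimplemented because their exact definitions live in the post's figures.

```python
import numpy as np

def random_distribution(n, rng):
    """Draw a distribution over n items, uniformly from the simplex."""
    x = rng.exponential(size=n)
    return x / x.sum()

def figures_for(p):
    """Hypothetical placeholder: return the four figures (f1, f2, f3, f4) for p."""
    raise NotImplementedError("depends on the graph-entropy combinations above")

sizes = [2, 3, 4, 5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200]
trials = 5000          # "some thousands" of random distributions per size
rng = np.random.default_rng(42)

averages = {}
for n in sizes:
    samples = []
    for _ in range(trials):
        f1, f2, f3, f4 = figures_for(random_distribution(n, rng))
        samples.append((f1 / f2, f2 / f3, f3 / f4))   # R1, R2, R3
    averages[n] = np.mean(samples, axis=0)            # values to plot vs. n
```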
Well, that was all I had about graph entropies for now; in future posts I may talk about some practical uses of the new entropy... or not!
Note: graph entropy, being constrained by those four nice inequalities, doesn't actually satisfy either of the two sub-axioms. Is there any generalized entropy meeting at least 4.1? Yes! There is one and only one: the Tsallis generalized entropy.
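For reference, here is a minimal sketch of the standard Tsallis entropy, S_q(P) = (1 − Σ p_i^q)/(q − 1), which recovers the Shannon value in the limit q → 1 (again my illustration, not code from the post):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy of order q (q != 1) of a distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.array([0.5, 0.3, 0.2])
print(tsallis_entropy(p, q=2.0))      # 0.62
print(tsallis_entropy(p, q=1.0001))   # ~1.0296, close to the Shannon value 1.0297
```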
In the meantime, you can check the slides I have prepared here.