Properties of Entropy
(1) Symmetry: $H_S(p_1, \ldots, p_N) = H_S(p_{K(1)}, \ldots, p_{K(N)})$ for any permutation $K$
(2) Normalization: $H_S(1/2, 1/2) = 1$
(3) Expandability: $H_S(p_1, \ldots, p_N) = H_S(0, p_1, \ldots, p_N) = H_S(p_1, \ldots, p_N, 0)$
(4) Determination: $H_S(1, 0) = 0$
(5) Extremity: $H_S(p_1, \ldots, p_N) \le H_S(1/N, \ldots, 1/N) = \log N$
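Properties (1)–(5) can be spot-checked numerically. The sketch below is illustrative (the function name is mine); it uses the base-2 logarithm so that the normalization $H_S(1/2, 1/2) = 1$ holds:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits (base-2 log); zero-probability terms are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# (2) Normalization: H_S(1/2, 1/2) = 1
assert math.isclose(shannon_entropy([0.5, 0.5]), 1.0)

# (1) Symmetry: permuting the probabilities leaves the entropy unchanged
assert math.isclose(shannon_entropy([0.7, 0.2, 0.1]),
                    shannon_entropy([0.1, 0.7, 0.2]))

# (3) Expandability: a zero-probability outcome contributes nothing
assert math.isclose(shannon_entropy([0.7, 0.3]),
                    shannon_entropy([0.0, 0.7, 0.3]))

# (4) Determination: a certain outcome carries no information
assert shannon_entropy([1.0, 0.0]) == 0.0

# (5) Extremity: the uniform distribution maximizes entropy, H <= log N
p = [0.4, 0.3, 0.2, 0.1]
assert shannon_entropy(p) <= math.log2(len(p))
print("properties (1)-(5) verified numerically")
```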
(6) Shannon Inequality: $-\sum_i p_i \log p_i \le -\sum_i p_i \log q_i$
(7) Conditional Entropy: $-\sum_{i,j} p(x_i, y_j) \log p(x_i \mid y_j) \le -\sum_i p(x_i) \log p(x_i)$
(8) Addition: $H_S(p_1 q_1, \ldots, p_1 q_M, \ldots, p_N q_1, \ldots, p_N q_M) = H_S(p_1, \ldots, p_N) + H_S(q_1, \ldots, q_M)$
(9) Strong Addition
(10) Partial Merging
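The Shannon inequality (6) and additivity (8) can also be verified numerically. A minimal sketch, with helper names of my own choosing; for (8) the joint distribution $\{p_i q_j\}$ is formed from two independent distributions:

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def cross_entropy(p, q):
    """-sum_i p_i log q_i, the right-hand side of the Shannon inequality."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]

# (6) Shannon inequality: -sum p log p <= -sum p log q, equality iff p = q
assert H(p) <= cross_entropy(p, q)
assert math.isclose(H(p), cross_entropy(p, p))

# (8) Addition: for the product distribution {p_i * q_j} of two
# independent distributions, entropies add.
qm = [0.6, 0.4]
joint = [pi * qj for pi in p for qj in qm]
assert math.isclose(H(joint), H(p) + H(qm))
print("Shannon inequality and additivity verified")
```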
Improvements and Extensions
-- Wiener (1948): Negative entropy
-- Faddeev (1958): Loosening the axiomatic conditions
-- Rényi (1970): Incomplete entropy
-- Rényi (1960): α-entropy
-- Daróczy (1970): β-entropy
-- Arimoto (1971): γ-entropy
-- Kolmogorov (1958): ε-entropy
-- Posner (1967): (ε, δ)-entropy
-- Kullback (1951): Information variance
-- Guiaşu (1975): Weighted entropy
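As one representative of these generalizations, the Rényi α-entropy $H_\alpha(p) = \frac{1}{1-\alpha}\log \sum_i p_i^\alpha$ recovers the Shannon entropy in the limit $\alpha \to 1$. A minimal sketch (function names are mine; only the Rényi case is shown, the other extensions follow their own definitions):

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi alpha-entropy in bits, defined for alpha > 0, alpha != 1."""
    return math.log2(sum(p ** alpha for p in probs if p > 0)) / (1 - alpha)

def shannon_entropy(probs):
    """Shannon entropy in bits, the alpha -> 1 limit of the Rényi entropy."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]

# As alpha -> 1, the Rényi entropy approaches the Shannon entropy
assert abs(renyi_entropy(p, 1.0001) - shannon_entropy(p)) < 1e-3

# The Rényi entropy is non-increasing in alpha
assert renyi_entropy(p, 0.5) >= renyi_entropy(p, 2.0)
print("Renyi entropy recovers Shannon entropy as alpha -> 1")
```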