Properties of Entropy
(1) Symmetry: H_S(p_1, …, p_N) = H_S(p_{K(1)}, …, p_{K(N)}) for any permutation K
(2) Normalization: H_S(1/2, 1/2) = 1
(3) Expandability: H_S(p_1, …, p_N) = H_S(0, p_1, …, p_N) = H_S(p_1, …, p_N, 0)
(4) Determination: H_S(1, 0) = 0
(5) Extremity: H_S(p_1, …, p_N) ≤ H_S(1/N, …, 1/N) = log N
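Properties (2)–(5) can be spot-checked numerically. The sketch below is illustrative only; it assumes the base-2 logarithm, which is what the normalization axiom H_S(1/2, 1/2) = 1 pins down.

```python
import math

def entropy(probs, base=2):
    """H_S(p_1, ..., p_N) = -sum_i p_i log p_i, with 0 log 0 taken as 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# (2) Normalization: H_S(1/2, 1/2) = 1 (this fixes the logarithm base to 2)
print(entropy([0.5, 0.5]))                               # 1.0

# (3) Expandability: a zero-probability outcome does not change the entropy
print(entropy([0.0, 0.2, 0.8]) == entropy([0.2, 0.8]))   # True

# (4) Determination: H_S(1, 0) = 0
print(entropy([1.0, 0.0]))                               # 0.0

# (5) Extremity: the uniform distribution maximizes H_S, reaching log N
N = 4
print(entropy([0.7, 0.1, 0.1, 0.1]))   # about 1.357, below log2(4) = 2
print(entropy([1 / N] * N))            # 2.0
```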
(6) Shannon Inequality: -Σ_i p_i log p_i ≤ -Σ_i p_i log q_i
(7) Conditional Entropy: -Σ_{i,j} p(x_i, y_j) log p(x_i | y_j) ≤ -Σ_i p(x_i) log p(x_i)
(8) Addition: H_S(p_1 q_1, …, p_1 q_M, …, p_N q_1, …, p_N q_M) = H_S(p_1, …, p_N) + H_S(q_1, …, q_M)
(9) Strong Addition
(10) Partial Merging
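The Shannon inequality (6) and the addition property (8) can likewise be verified numerically. This is a sketch on randomly generated distributions; the helper names `cross_entropy` and `random_dist` are my own, not from the slides.

```python
import math
import random

def entropy(probs, base=2):
    """H_S(p) = -sum_i p_i log p_i, with 0 log 0 taken as 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def cross_entropy(p, q, base=2):
    """-sum_i p_i log q_i (assumes q_i > 0 wherever p_i > 0)."""
    return -sum(pi * math.log(qi, base) for pi, qi in zip(p, q) if pi > 0)

def random_dist(n):
    """A random probability distribution over n outcomes."""
    w = [random.random() for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

random.seed(0)
p, q = random_dist(5), random_dist(5)

# (6) Shannon inequality: -sum p log p <= -sum p log q, equality iff p = q
print(entropy(p) <= cross_entropy(p, q))                    # True

# (8) Addition: for the product distribution (p_i q_j), entropies add up
product = [pi * qj for pi in p for qj in q]
print(math.isclose(entropy(product), entropy(p) + entropy(q)))  # True
```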
Improvements and Extensions
-- Wiener (1948): Negative entropy
-- Faddeev (1958): Loosening the conditions
-- Rényi (1970): Incomplete entropy
-- Rényi (1960): α-entropy
-- Daróczy (1970): β-entropy
-- Arimoto (1971): γ-entropy
-- Kolmogorov (1958): ε-entropy
-- Posner (1967): (ε, δ)-entropy
-- Kullback (1951): Information variance
-- Guiasu (1975): Weighted entropy