A Generalization of Markov's Inequality

Theorem: For any random variable X, for any h : X → ℝ⁺, and for any t > 0,

    Pr[h(X) ≥ t] ≤ E[h(X)] / t.

Special cases: Chebyshev, Chernoff.
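As a sanity check, the generalized bound can be tested empirically. The sketch below uses an illustrative choice of X (uniform on [0, 1]), h(x) = x², and t = 0.5, none of which come from the slides:

```python
import random

# Empirical check of Pr[h(X) >= t] <= E[h(X)] / t for nonnegative h.
# X ~ Uniform(0, 1), h(x) = x^2, t = 0.5 are illustrative choices.
random.seed(0)
samples = [random.uniform(0, 1) for _ in range(100_000)]

def h(x):
    return x * x  # a nonnegative function of X

t = 0.5
mean_h = sum(h(x) for x in samples) / len(samples)   # estimates E[h(X)] = 1/3
tail = sum(h(x) >= t for x in samples) / len(samples)

print(f"Pr[h(X) >= {t}] ~ {tail:.4f}")
print(f"E[h(X)] / t    ~ {mean_h / t:.4f}")
assert tail <= mean_h / t  # the Markov bound holds on the sample
```

Here the empirical tail probability (about 0.29) sits comfortably below the bound E[h(X)]/t ≈ 0.67; Markov-type bounds are typically loose, which is why the sharper Chebyshev and Chernoff refinements below matter.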
Chebyshev's Inequality

Chebyshev's Inequality: For any t > 0,

    Pr[|X − E[X]| ≥ t] ≤ Var[X] / t².
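Chebyshev's bound can also be checked numerically. The example below uses a sum of 100 fair coin flips (my illustrative choice, not from the slides), for which E[X] = 50 and Var[X] = 25:

```python
import random

# Empirical check of Pr[|X - E[X]| >= t] <= Var[X] / t^2
# for X = sum of 100 fair coin flips (illustrative distribution).
random.seed(0)
n = 100
samples = [sum(random.randint(0, 1) for _ in range(n)) for _ in range(20_000)]

mean = sum(samples) / len(samples)                          # ~ n/2 = 50
var = sum((x - mean) ** 2 for x in samples) / len(samples)  # ~ n/4 = 25

t = 10
tail = sum(abs(x - mean) >= t for x in samples) / len(samples)
print(f"Pr[|X - E[X]| >= {t}] ~ {tail:.4f}, bound Var[X]/t^2 ~ {var / t**2:.4f}")
assert tail <= var / t ** 2  # Chebyshev's bound holds on the sample
```

The bound Var[X]/t² = 0.25 is again loose compared to the empirical tail (a few percent), but unlike Markov's inequality it only needs the variance, not the full distribution.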
Variance

Definition (variance): The variance of a random variable X is

    Var[X] = E[(X − E[X])²] = E[X²] − (E[X])².

The standard deviation of a random variable X is

    σ[X] = √Var[X].
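The two expressions for the variance can be verified to agree on a small explicit distribution; the fair six-sided die below is my illustrative example:

```python
import math

# Check E[(X - E[X])^2] == E[X^2] - (E[X])^2 on a fair six-sided die.
values = [1, 2, 3, 4, 5, 6]
p = 1 / 6  # uniform probability of each face

ex = sum(v * p for v in values)                         # E[X] = 3.5
var_centered = sum((v - ex) ** 2 * p for v in values)   # E[(X - E[X])^2]
var_moments = sum(v * v * p for v in values) - ex ** 2  # E[X^2] - (E[X])^2

sigma = math.sqrt(var_centered)                         # standard deviation
print(var_centered, var_moments, sigma)                 # both variances = 35/12
assert abs(var_centered - var_moments) < 1e-12
```

Both formulas give 35/12 ≈ 2.917; the second form is often more convenient because it needs only the first two moments.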
Covariance

Definition (covariance): The covariance of X and Y is

    Cov(X, Y) = E[(X − E[X])(Y − E[Y])].

Theorem:

    Var[X + Y] = Var[X] + Var[Y] + 2 Cov(X, Y);

    Var[∑ᵢ₌₁ⁿ Xᵢ] = ∑ᵢ₌₁ⁿ Var[Xᵢ] + ∑ᵢ≠ⱼ Cov(Xᵢ, Xⱼ).
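The identity Var[X + Y] = Var[X] + Var[Y] + 2 Cov(X, Y) holds exactly for sample moments as well, so it can be checked to floating-point precision on correlated data (the Gaussian construction below is an illustrative choice, not from the slides):

```python
import random

# Numerical check of Var[X + Y] = Var[X] + Var[Y] + 2 Cov(X, Y)
# on deliberately correlated samples.
random.seed(0)
xs = [random.gauss(0, 1) for _ in range(50_000)]
ys = [x + random.gauss(0, 1) for x in xs]  # Y correlated with X

def mean(a):
    return sum(a) / len(a)

def var(a):
    m = mean(a)
    return sum((v - m) ** 2 for v in a) / len(a)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / len(a)

lhs = var([x + y for x, y in zip(xs, ys)])
rhs = var(xs) + var(ys) + 2 * cov(xs, ys)
print(lhs, rhs)
assert abs(lhs - rhs) < 1e-6  # agreement up to floating-point error
```

Because the identity is algebraic, the two sides match for any sample, not just in expectation; only rounding error separates them.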
Covariance

Theorem: For independent X and Y, E[X · Y] = E[X] · E[Y].

Theorem: For independent X and Y, Cov(X, Y) = 0.

Proof: Cov(X, Y) = E[(X − E[X])(Y − E[Y])]
                 = E[X − E[X]] · E[Y − E[Y]]   (by independence)
                 = 0.
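For independent samples, the sample covariance is only near zero, not exactly zero, because of finite-sample noise of order 1/√n. A quick illustration (independent Gaussians, my choice of distribution):

```python
import random

# Sample covariance of two independently drawn sequences
# should be close to 0 (within ~1/sqrt(n) sampling noise).
random.seed(1)
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [random.gauss(0, 1) for _ in range(n)]  # drawn independently of xs

mx = sum(xs) / n
my = sum(ys) / n
sample_cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
print(f"sample Cov(X, Y) = {sample_cov:.5f}")
assert abs(sample_cov) < 0.02  # well within sampling noise of 0
```

Note the converse fails: Cov(X, Y) = 0 does not imply independence (e.g. X uniform on {−1, 0, 1} and Y = X² have zero covariance but are dependent).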