16.322 Stochastic Estimation and Control, Fall 2004
Prof. Vander Velde

Lecture 13

Last time:

[Figure: sample function x(t) with change points at t, t+T, t+2T, ...]

$$
R_{xx}(\tau) =
\begin{cases}
a^2 + \sigma_a^2\left(1 - \dfrac{|\tau|}{T}\right), & |\tau| \le T \\[4pt]
a^2, & |\tau| > T
\end{cases}
$$

$$
\begin{aligned}
S_{xx}(\omega) &= \int_{-\infty}^{\infty} R_{xx}(\tau)\, e^{-j\omega\tau}\, d\tau \\
&= 2\pi a^2 \delta(\omega) + \sigma_a^2 \int_{-T}^{T} \left(1 - \frac{|\tau|}{T}\right) e^{-j\omega\tau}\, d\tau \\
&= 2\pi a^2 \delta(\omega) + 2\sigma_a^2 \int_{0}^{T} \left(1 - \frac{\tau}{T}\right) \cos\omega\tau\, d\tau \\
&= 2\pi a^2 \delta(\omega) + \frac{2\sigma_a^2}{T\omega^2}\left(1 - \cos\omega T\right) \\
&= 2\pi a^2 \delta(\omega) + T\sigma_a^2 \left(\frac{\sin\frac{\omega T}{2}}{\frac{\omega T}{2}}\right)^2
\end{aligned}
$$

The amplitude of $S_{xx}$ falls off, but not very rapidly.
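This transform pair can be checked numerically. The sketch below (the parameter values sigma_a^2 = 2, T = 1.5, omega = 3 are arbitrary, not from the notes) integrates the triangular correlation directly and compares it with the closed-form sinc-squared expression for the continuous part of the spectrum:

```python
import numpy as np

# Arbitrary example values (not from the notes).
sigma2, T, omega = 2.0, 1.5, 3.0

def S_continuous(omega, sigma2, T):
    """Continuous part of S_xx: T*sigma_a^2*(sin(wT/2)/(wT/2))^2.
    The impulse 2*pi*a^2*delta(w) from the mean is omitted."""
    x = omega * T / 2.0
    return sigma2 * T * np.sinc(x / np.pi) ** 2  # np.sinc(u) = sin(pi*u)/(pi*u)

# Midpoint-rule integration of sigma_a^2*(1 - |tau|/T)*e^{-j*w*tau}
# over [-T, T]; the integrand is even, so cos(w*tau) suffices.
N = 400_000
dtau = 2 * T / N
tau = -T + (np.arange(N) + 0.5) * dtau
R = sigma2 * (1.0 - np.abs(tau) / T)
numeric = np.sum(R * np.cos(omega * tau)) * dtau
closed = S_continuous(omega, sigma2, T)
```

The intermediate form $2\sigma_a^2(1-\cos\omega T)/(T\omega^2)$ agrees with the sinc-squared form via $1-\cos\omega T = 2\sin^2(\omega T/2)$.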
Use the error between early and late indicators to lock onto the signal. The error is a linear function of shift, within the range $(-T, T)$.

Return to the 1st example process and take the case where the change points are Poisson distributed.

[Figure: sample function x(t), with mean $\bar{a} = 0$]

$$
S_{xx}(\omega) = \frac{2\lambda\sigma_a^2}{\omega^2 + \lambda^2}
$$

Take the limit of this as $\sigma_a^2$ and $\lambda$ become large in a particular relation: to establish the desired relation, replace

$$
\sigma_a^2 \to k\sigma_a^2, \qquad \lambda \to k\lambda
$$

$$
S_{xx}(\omega) = \frac{2(k\lambda)(k\sigma_a^2)}{\omega^2 + (k\lambda)^2}
$$

and take the limit as $k \to \infty$:

$$
\lim_{k\to\infty} S_{xx}(\omega) = \lim_{k\to\infty} \frac{2k^2\lambda\sigma_a^2}{\omega^2 + k^2\lambda^2} = \frac{2\sigma_a^2}{\lambda}
$$

Note this is independent of frequency.
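The flattening of the spectrum can be seen numerically. In this sketch (sigma_a^2 = 3, lambda = 0.5 and the frequency samples are arbitrary choices), the scaled spectrum approaches the level 2*sigma_a^2/lambda at every frequency as k grows:

```python
import numpy as np

def S_scaled(w, sig2, lam, k):
    """Spectrum after the substitution sigma_a^2 -> k*sigma_a^2, lambda -> k*lambda:
    S_xx(w) = 2*(k*lam)*(k*sig2) / (w^2 + (k*lam)^2)."""
    return 2 * (k * lam) * (k * sig2) / (w**2 + (k * lam) ** 2)

# Arbitrary example values (not from the notes).
sig2, lam = 3.0, 0.5
w = np.array([0.0, 1.0, 10.0, 100.0])

flat_limit = 2 * sig2 / lam               # predicted white-noise level
vals = S_scaled(w, sig2, lam, k=10_000)   # nearly flat across all w
```

For k = 1 the spectrum still varies strongly with frequency; for k = 10,000 it is flat to within a fraction of a percent over this frequency range.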
[Figure: $S_{xx}(\omega)$, a constant spectral density at all frequencies]

This is defined to be a "white noise" by analogy with white light, which is supposed to have equal participation by all wavelengths.

Can shape $x(t)$ to the correct spectrum so that it can be analyzed in this manner, by adding a shaping filter in the state-space formulation.

[Figure: white noise n(t) → shaping filter → system]

Definition of a white noise process

White means constant spectral density.

$$
S_{xx}(\omega) = S_0, \text{ constant}
$$

$$
R_{xx}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_0\, e^{j\omega\tau}\, d\omega = S_0\, \delta(\tau)
$$

White noise processes only have a defined power density. The variance of a white noise process is not defined.

If you start with almost any process $x(t)$,

$$
\lim_{a\to\infty} \sqrt{a}\, x(at)
$$

is a white noise.
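A minimal sketch of the shaping-filter idea: assuming unit white noise (S_0 = 1) and a hypothetical first-order filter H(s) = sqrt(2*lam*sig2)/(s + lam), which is not given in the notes, the output spectrum |H(jw)|^2 * S_0 reproduces the Lorentzian spectrum of the earlier Poisson-switched example:

```python
import numpy as np

# Assumed target: the Lorentzian spectrum of the Poisson-switched example,
#   S_xx(w) = 2*lam*sig2 / (w^2 + lam^2).
# Hypothetical first-order shaping filter driven by unit white noise (S0 = 1):
#   H(s) = sqrt(2*lam*sig2) / (s + lam),  S_out(w) = |H(jw)|^2 * S0.
def S_target(w, sig2, lam):
    return 2 * lam * sig2 / (w**2 + lam**2)

def H(jw, sig2, lam):
    return np.sqrt(2 * lam * sig2) / (jw + lam)

sig2, lam = 4.0, 2.0                  # arbitrary example values
S0 = 1.0
w = np.linspace(-20.0, 20.0, 101)
S_out = np.abs(H(1j * w, sig2, lam)) ** 2 * S0
```

Since |H(jw)|^2 = 2*lam*sig2/(w^2 + lam^2), the match is exact, illustrating how a white-noise input plus a filter can stand in for a colored process in a state-space model.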
So far we have looked only at the $R_{xx}$ and $S_{xx}$ of processes. We shall find that if we wish to determine only the $R_{yy}$ or $S_{yy}$ (thus the $\overline{y^2}$) of outputs of linear systems, all we need to know about the inputs are their $R_{xx}$ or $S_{xx}$.

But what if we wanted to know the probability that the error in a dynamic system would exceed some bound? For this we need the first probability density function of the system error, an output. This is very difficult in general.

[Figure: n(t) → System → y(t)]

The pdf of the output $y(t)$ satisfies the Fokker-Planck partial differential equation, also called the Kolmogorov forward equation. It applies to a continuous dynamic system driven by a white noise process.

[Figure: Gaussian process → linear system → Gaussian process]

One case is easy: a Gaussian process into a linear system yields a Gaussian output.

Gaussian processes are defined by the property that probability density functions of all orders are normal functions.

$$
f_n(x_1, t_1; x_2, t_2; \ldots; x_n, t_n) = n\text{-dimensional normal}
$$

$$
f(x) = \frac{1}{(2\pi)^{n/2} |M|^{1/2}}\, e^{-\frac{1}{2} x^T M^{-1} x}
$$

$M$ is the covariance matrix for

$$
x = \begin{bmatrix} x(t_1) \\ x(t_2) \\ \vdots \\ x(t_n) \end{bmatrix}
$$

Thus $f_n(x)$ for all $n$ is determined by $M$, the covariance matrix for $x$.
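The density above can be written as a short function. A sketch (zero mean, as in the formula; the evaluation point and covariance are arbitrary examples), checked against the fact that with M = I the density factors into independent standard normals:

```python
import numpy as np

def gaussian_pdf(x, M):
    """n-dimensional zero-mean normal density:
    f(x) = exp(-x^T M^{-1} x / 2) / ((2*pi)^(n/2) * |M|^(1/2))."""
    n = len(x)
    quad = x @ np.linalg.solve(M, x)          # x^T M^{-1} x without forming M^{-1}
    norm = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(M))
    return np.exp(-0.5 * quad) / norm

# Example: with M = I the joint density is a product of 1-D standard normals.
x = np.array([0.3, -1.2])
M = np.eye(2)
val = gaussian_pdf(x, M)
```

Using `np.linalg.solve` rather than an explicit matrix inverse is the usual numerically preferable choice for the quadratic form.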
$$
M_{ij} = \overline{x(t_i)\, x(t_j)} = R_{xx}(t_i, t_j)
$$

Thus for a Gaussian process, the autocorrelation function completely defines all the statistical properties of the process, since it defines the probability density functions of all orders. This means:

If $R_{xx}(t_i, t_j) = R_{xx}(t_j - t_i)$, the process is stationary.

If two processes $x(t)$, $y(t)$ are jointly Gaussian, and are uncorrelated ($R_{xy}(t_i, t_j) = 0$), they are independent processes.

Most important: Gaussian input → linear system → Gaussian output. In this case all the statistical properties of the output are determined by the correlation function of the output, for which we shall require only the correlation functions of the inputs.

Upcoming lectures will not cover several sections that deal with:

Narrow-band Gaussian processes
Fast Fourier Transform
Pseudorandom binary coded signals

These are important topics for your general knowledge.

Characteristics of Linear Systems

Definition of linear system: If

$$
u_1(t) \to y_1(t)
$$
$$
u_2(t) \to y_2(t)
$$

then

$$
a\, u_1(t) + b\, u_2(t) \to a\, y_1(t) + b\, y_2(t)
$$
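The superposition property defining linearity can be checked numerically for a simple linear system, here discrete convolution with a fixed impulse response (a sketch; the impulse response h, the inputs, and the scalars a, b are arbitrary choices, not from the notes):

```python
import numpy as np

# A linear system: discrete convolution with a fixed impulse response h.
# Superposition requires L[a*u1 + b*u2] = a*L[u1] + b*L[u2].
rng = np.random.default_rng(0)
h = np.array([1.0, 0.5, 0.25])                       # assumed impulse response
u1 = rng.standard_normal(50)
u2 = rng.standard_normal(50)
a, b = 2.0, -3.0

L = lambda u: np.convolve(h, u)                      # the system operator
lhs = L(a * u1 + b * u2)                             # response to combined input
rhs = a * L(u1) + b * L(u2)                          # combination of responses
```

Convolution is linear by construction, so `lhs` and `rhs` agree to machine precision; a system with, say, a squaring element would fail this check.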