12.540 Principles of the Global Positioning System
Lecture 12
Prof. Thomas Herring
03/18/02
Estimation
• Summary
  – Examine correlations
  – Process noise
    • White noise
    • Random walk
    • First-order Gauss-Markov processes
  – Kalman filters: estimation in which the parameters to be estimated change with time
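The three process-noise types listed above can be illustrated numerically. The following is a minimal sketch (not from the lecture; the noise amplitudes, correlation time, and NumPy usage are my own assumed choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000       # number of epochs (assumed, for illustration)
sigma = 1.0    # noise standard deviation (assumed)

# White noise: independent sample at each epoch
white = rng.normal(0.0, sigma, n)

# Random walk: cumulative sum of white noise; its variance grows with time
random_walk = np.cumsum(rng.normal(0.0, sigma, n))

# First-order Gauss-Markov: exponentially correlated, stationary process
#   x_{k+1} = exp(-dt/tau) * x_k + w_k,
# with w_k scaled so the steady-state variance stays at sigma^2
dt, tau = 1.0, 50.0            # sample interval and correlation time (assumed)
phi = np.exp(-dt / tau)
fogm = np.zeros(n)
for k in range(1, n):
    fogm[k] = phi * fogm[k - 1] + rng.normal(0.0, sigma * np.sqrt(1 - phi**2))
```

The key distinction for estimation is that white noise has no memory, a random walk never forgets, and the Gauss-Markov process forgets with time constant tau; a Kalman filter models each by a different state-transition and process-noise specification.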
Correlations
• Statistical behavior in which random variables tend to behave in related fashions
• Correlations are calculated from the covariance matrix. Specifically, the parameter estimates from an estimation are typically correlated
• Any correlated group of random variables can be expressed as a linear combination of uncorrelated random variables by finding the eigenvectors (the linear combinations) and eigenvalues (the variances of the uncorrelated random variables)
Eigenvectors and Eigenvalues
• The eigenvectors and eigenvalues of a square matrix satisfy the equation Ax = λx
• If A is symmetric and positive definite (as a covariance matrix is), then all the eigenvectors are orthogonal and all the eigenvalues are positive
• Any covariance matrix can be broken down into independent components made up of the eigenvectors, with variances given by the eigenvalues. This gives one method of generating samples of any random process: generate white-noise samples with variances given by the eigenvalues, then transform them using a matrix whose columns are the eigenvectors
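The sample-generation method described above can be sketched as follows (the covariance values are assumed for the example; `numpy.linalg.eigh` is one standard eigen-solver for symmetric matrices):

```python
import numpy as np

# Illustrative 2x2 covariance matrix (values assumed, not from the lecture)
C = np.array([[4.0, 1.5],
              [1.5, 2.0]])

# Eigen-decomposition of a symmetric matrix: w holds the (positive)
# eigenvalues, and the columns of V are the orthogonal eigenvectors
w, V = np.linalg.eigh(C)

# Generate uncorrelated white-noise samples with variances given by the
# eigenvalues, then transform by the matrix of eigenvector columns
rng = np.random.default_rng(1)
n = 100_000
z = rng.normal(0.0, np.sqrt(w), size=(n, 2))   # independent components
x = z @ V.T                                    # each row is x = V z

# The sample covariance of x approaches C as n grows
print(np.cov(x, rowvar=False))
```

This works because cov(Vz) = V diag(w) Vᵀ, which is exactly the eigen-decomposition of C.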
Error ellipses
• One special case is error ellipses. Normally coordinates (say North and East) are correlated, and we find linear combinations of North and East that are uncorrelated. Given their covariance matrix:

  ⎡ σn²  σne ⎤
  ⎣ σne  σe² ⎦   Covariance matrix

  Eigenvalues satisfy  λ² − (σn² + σe²)λ + (σn²σe² − σne²) = 0

  Eigenvectors:  ⎡ σne      ⎤       ⎡ λ2 − σe² ⎤
                 ⎣ λ1 − σn² ⎦  and  ⎣ σne      ⎦
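The quadratic above can be solved directly for the ellipse axes and orientation. A minimal sketch (the North/East covariance values are assumed for illustration, and the cross-check against NumPy's eigen-solver is my addition):

```python
import numpy as np

# Covariance of North and East coordinates (values assumed, in m^2)
sig_n2, sig_e2, sig_ne = 0.0025, 0.0016, 0.0010

C = np.array([[sig_n2, sig_ne],
              [sig_ne, sig_e2]])

# Eigenvalues from the quadratic on the slide:
#   lambda^2 - (sig_n2 + sig_e2)*lambda + (sig_n2*sig_e2 - sig_ne^2) = 0
tr = sig_n2 + sig_e2                  # trace of C
det = sig_n2 * sig_e2 - sig_ne**2    # determinant of C
lam1 = (tr + np.sqrt(tr**2 - 4 * det)) / 2
lam2 = (tr - np.sqrt(tr**2 - 4 * det)) / 2

# Semi-axes of the 1-sigma error ellipse are the square roots
a, b = np.sqrt(lam1), np.sqrt(lam2)

# Orientation of the major axis from the North axis, using the
# eigenvector [sig_ne, lam1 - sig_n2] given on the slide
theta = np.degrees(np.arctan2(lam1 - sig_n2, sig_ne))

# Cross-check against a library eigen-solver (eigh sorts ascending)
w, _ = np.linalg.eigh(C)
print(a, b, theta)
```

Since λ1 and λ2 are the variances of the uncorrelated linear combinations, the ellipse semi-axes follow directly as their square roots.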