Diagonal decomposition: why/how

Let U have the eigenvectors as columns:

$$U = \begin{pmatrix} u_1 & u_2 & \cdots & u_n \end{pmatrix}$$

Then SU can be written

$$SU = S\begin{pmatrix} u_1 & \cdots & u_n \end{pmatrix} = \begin{pmatrix} \lambda_1 u_1 & \cdots & \lambda_n u_n \end{pmatrix} = \begin{pmatrix} u_1 & \cdots & u_n \end{pmatrix}\begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix}$$

▪ Thus $SU = U\Lambda$, or $U^{-1}SU = \Lambda$.
▪ And $S = U\Lambda U^{-1}$.
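To make the mechanics concrete, here is a minimal NumPy sketch (my illustration, not part of the slides) that computes the eigendecomposition of the running example matrix and checks $SU = U\Lambda$ and $S = U\Lambda U^{-1}$:

```python
import numpy as np

# The slides' running example matrix (any diagonalizable S works).
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalues lam and eigenvector matrix U (eigenvectors as columns).
lam, U = np.linalg.eig(S)
Lam = np.diag(lam)

# SU = U Lam, hence U^{-1} S U = Lam and S = U Lam U^{-1}.
assert np.allclose(S @ U, U @ Lam)
assert np.allclose(np.linalg.inv(U) @ S @ U, Lam)
assert np.allclose(U @ Lam @ np.linalg.inv(U), S)
```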
Diagonal decomposition – example

Recall $S = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$; $\lambda_1 = 1$, $\lambda_2 = 3$.

The eigenvectors $\begin{pmatrix} 1 \\ -1 \end{pmatrix}$ and $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$ form $U = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}$.

Inverting, we have $U^{-1} = \begin{pmatrix} 1/2 & -1/2 \\ 1/2 & 1/2 \end{pmatrix}$. Recall $UU^{-1} = I$.

Then $S = U\Lambda U^{-1} = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 0 & 3 \end{pmatrix}\begin{pmatrix} 1/2 & -1/2 \\ 1/2 & 1/2 \end{pmatrix}$.
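The worked example can be verified numerically; this sketch (again an illustration, assuming NumPy) plugs in the U, Λ, and U⁻¹ from the slide:

```python
import numpy as np

# Eigenvectors (1, -1) and (1, 1) as columns; eigenvalues 1 and 3.
U = np.array([[1.0, 1.0],
              [-1.0, 1.0]])
Lam = np.diag([1.0, 3.0])
U_inv = np.array([[0.5, -0.5],
                  [0.5, 0.5]])

assert np.allclose(U @ U_inv, np.eye(2))           # U U^{-1} = I
assert np.allclose(U @ Lam @ U_inv, [[2.0, 1.0],
                                     [1.0, 2.0]])  # recovers S
```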
Example continued

Let's divide U (and multiply $U^{-1}$) by $\sqrt{2}$. Then

$$S = \begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ -1/\sqrt{2} & 1/\sqrt{2} \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 0 & 3 \end{pmatrix}\begin{pmatrix} 1/\sqrt{2} & -1/\sqrt{2} \\ 1/\sqrt{2} & 1/\sqrt{2} \end{pmatrix} = Q\Lambda Q^T \quad (Q^{-1} = Q^T)$$

Why? Stay tuned …
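A quick numerical check (an illustrative NumPy sketch, using the $\sqrt{2}$-scaled eigenvectors) that the rescaled Q is orthogonal and still reproduces S:

```python
import numpy as np

# Divide the columns of U by sqrt(2) so they become unit vectors.
Q = np.array([[1.0, 1.0],
              [-1.0, 1.0]]) / np.sqrt(2)
Lam = np.diag([1.0, 3.0])

assert np.allclose(Q.T @ Q, np.eye(2))           # orthogonal: Q^{-1} = Q^T
assert np.allclose(Q @ Lam @ Q.T, [[2.0, 1.0],
                                   [1.0, 2.0]])  # S = Q Lam Q^T
```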
Symmetric eigen decomposition

▪ If $S$ is a symmetric matrix:
▪ Theorem: There exists a (unique) eigen decomposition $S = Q\Lambda Q^T$
▪ where Q is orthogonal: $Q^{-1} = Q^T$
▪ Columns of Q are normalized eigenvectors.
▪ Columns are orthogonal.
▪ (everything is real)
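For symmetric matrices, NumPy's `np.linalg.eigh` produces exactly this kind of decomposition, with normalized orthogonal eigenvectors and real eigenvalues; a short sketch as illustration:

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is specialized for symmetric/Hermitian matrices: real eigenvalues,
# orthonormal eigenvectors returned as the columns of Q.
lam, Q = np.linalg.eigh(S)

assert np.allclose(Q.T @ Q, np.eye(2))          # Q orthogonal
assert np.allclose(Q @ np.diag(lam) @ Q.T, S)   # S = Q Lam Q^T
```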
Time out!

▪ I came to this class to learn about web search and mining, not have my linear algebra past dredged up again …
▪ But if you want to dredge, Strang's Applied Mathematics is a good place to start.
▪ What do these matrices have to do with text?
▪ Recall M × N term-document matrices …
▪ But everything so far needs square matrices – so …