Matrix Factorization and Latent Semantic Indexing (Background)

Matrix vector multiplication
▪ Thus a matrix-vector multiplication such as Sx (S, x as in the previous slide) can be rewritten in terms of the eigenvalues/vectors:
      Sx = S(2v1 + 4v2 + 6v3)
      Sx = 2Sv1 + 4Sv2 + 6Sv3 = 2λ1v1 + 4λ2v2 + 6λ3v3
      Sx = 60v1 + 80v2 + 6v3
▪ Even though x is an arbitrary vector, the action of S on x is determined by the eigenvalues/vectors.
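The identity above can be checked numerically. Since the matrix S from the previous slide is not reproduced here, this plain-Python sketch substitutes the 2×2 symmetric matrix S = [[2, 1], [1, 2]] from the Example slide later in this section (eigenvalues 1 and 3, eigenvectors (1, −1) and (1, 1)); the helper name `matvec` is illustrative:

```python
# Action of S on x via its eigenvalues/eigenvectors, using the 2x2
# example matrix S = [[2, 1], [1, 2]] as a stand-in.
def matvec(M, x):
    # Plain matrix-vector product.
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

S = [[2, 1], [1, 2]]
lam1, v1 = 1, [1, -1]   # first eigenpair
lam2, v2 = 3, [1, 1]    # second eigenpair

# Write x in the eigenbasis: x = 2*v1 + 4*v2
x = [2 * v1[i] + 4 * v2[i] for i in range(2)]

direct = matvec(S, x)                                            # S x, computed directly
via_eigen = [2 * lam1 * v1[i] + 4 * lam2 * v2[i] for i in range(2)]  # 2*lam1*v1 + 4*lam2*v2
print(direct, via_eigen)  # both [14, 10]
```

The two routes agree: once x is expressed in the eigenbasis, applying S reduces to scaling each coordinate by its eigenvalue.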
Matrix vector multiplication
▪ Suggestion: the effect of "small" eigenvalues is small.
▪ If we ignored the smallest eigenvalue (1), then instead of (60, 80, 6) we would get (60, 80, 0).
▪ These vectors are similar (in cosine similarity, etc.).
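The similarity claim can be checked directly. A minimal plain-Python sketch (the `cosine` helper is illustrative):

```python
import math

def cosine(a, b):
    # Cosine similarity of two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

full = [60, 80, 6]        # all three eigenvalues kept
truncated = [60, 80, 0]   # smallest eigenvalue (1) ignored
print(cosine(full, truncated))  # ~0.9982
```

Dropping the smallest eigenvalue moves the vector only slightly, which is the intuition behind low-rank approximation in LSI.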
Eigenvalues & Eigenvectors
▪ For symmetric matrices, eigenvectors for distinct eigenvalues are orthogonal:
      Sv{1,2} = λ{1,2} v{1,2}, and λ1 ≠ λ2 ⇒ v1 · v2 = 0
▪ All eigenvalues of a real symmetric matrix are real.
▪ All eigenvalues of a positive semidefinite matrix are non-negative:
      ∀w ∈ ℝⁿ, wᵀSw ≥ 0; then if Sv = λv, λ ≥ 0
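These properties can be spot-checked on the 2×2 matrix from the Example slide below (the `quad_form` helper and the sample vectors are illustrative choices, not from the slides):

```python
# Spot-check on the symmetric PSD matrix S = [[2, 1], [1, 2]],
# whose eigenpairs are (1, (1, -1)) and (3, (1, 1)).
S = [[2, 1], [1, 2]]
v1, v2 = [1, -1], [1, 1]   # eigenvectors for the distinct eigenvalues 1 and 3

# Orthogonality: v1 . v2 = 0
dot = sum(a * b for a, b in zip(v1, v2))
print(dot)  # 0

# Positive semidefiniteness: w^T S w >= 0 for sample vectors w
def quad_form(M, w):
    return sum(w[i] * M[i][j] * w[j] for i in range(2) for j in range(2))

samples = [[1, 0], [0, 1], [1, -1], [-2, 3]]
print(all(quad_form(S, w) >= 0 for w in samples))  # True
```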
Example
▪ Let S = | 2  1 |     (real, symmetric)
          | 1  2 |
▪ Then S − λI = | 2−λ   1  |,  and |S − λI| = (2−λ)² − 1 = 0.
                |  1   2−λ |
▪ The eigenvalues are 1 and 3 (nonnegative, real). Plug in these values and solve for eigenvectors.
▪ The eigenvectors are orthogonal (and real): (1, −1) and (1, 1).
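A short sketch of the eigenvalue computation: the characteristic polynomial (2 − λ)² − 1 = λ² − 4λ + 3 = 0 solved with the quadratic formula:

```python
import math

# |S - lam*I| = (2 - lam)^2 - 1 = lam^2 - 4*lam + 3 = 0
a, b, c = 1, -4, 3
disc = math.sqrt(b * b - 4 * a * c)
roots = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
print(roots)  # [1.0, 3.0]
```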
Eigen/diagonal Decomposition
▪ Let S be a square m × m matrix with m linearly independent eigenvectors (a "non-defective" matrix).
▪ Theorem: there exists an eigen decomposition S = UΛU⁻¹ (cf. matrix diagonalization theorem); Λ is diagonal, and the decomposition is unique for distinct eigenvalues.
▪ Columns of U are the eigenvectors of S.
▪ Diagonal elements of Λ are the eigenvalues of S:
      Λ = diag(λ1, …, λm), with λ1 ≥ λ2 ≥ … ≥ λm
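A minimal sketch of the decomposition for the 2×2 example S = [[2, 1], [1, 2]]: with eigenvalues ordered λ1 = 3 ≥ λ2 = 1 and the eigenvector columns of U in matching order, the product U Λ U⁻¹ reproduces S (the `matmul` helper is illustrative):

```python
# S = U Lam U^{-1} for S = [[2, 1], [1, 2]].
def matmul(A, B):
    # Plain matrix-matrix product.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

U = [[1, 1], [1, -1]]    # columns: eigenvector for lam=3, then for lam=1
Lam = [[3, 0], [0, 1]]   # diag(lam1, lam2), lam1 >= lam2

# Inverse of a 2x2 matrix via the adjugate formula.
det = U[0][0] * U[1][1] - U[0][1] * U[1][0]
U_inv = [[U[1][1] / det, -U[0][1] / det],
         [-U[1][0] / det, U[0][0] / det]]

S_rebuilt = matmul(matmul(U, Lam), U_inv)
print(S_rebuilt)  # [[2.0, 1.0], [1.0, 2.0]]
```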