3.1 Sparsity: Applications and Development

Collaborative filtering:
Customers are asked to rank items.
Not all customers rank all items.
Predict the missing rankings.
3.1 Sparsity: Applications and Development

The Netflix Prize:
About a million users and 25,000 movies.
Known rankings are sparsely distributed.
Predict the unknown ratings.
3.1 Sparsity: Applications and Development

In 2006, monumental papers on compressive sensing were published:

Emmanuel Candès, Justin Romberg, and Terence Tao, "Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information." IEEE Trans. on Information Theory, 52(2), pp. 489-509, February 2006.

David Donoho, "Compressed sensing." IEEE Trans. on Information Theory, 52(4), pp. 1289-1306, April 2006.

Emmanuel Candès and Terence Tao, "Near-optimal signal recovery from random projections: Universal encoding strategies?" IEEE Trans. on Information Theory, 52(12), pp. 5406-5425, December 2006.

Donoho was awarded the Shaw Prize.
3.2 Sparsity Rendering Algorithms

The central problem in compressive sensing is the following: given a sparse vector w, form the compressed measurement y = Φw, where Φ is an underdetermined matrix (more columns than rows). The target is to recover w from y.

Bad news: Φ is underdetermined, so normally y = Φw has infinitely many solutions.
Good news: we have prior information: w is sparse.
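The setup above can be illustrated with a minimal NumPy sketch (not from the slides; the dimensions and sparsity pattern are arbitrary choices for demonstration). It shows that a generic least-squares solution of the underdetermined system y = Φw satisfies the measurements but is not the sparse signal we want:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 10, 30                     # underdetermined: fewer measurements than unknowns
Phi = rng.standard_normal((m, n))

w = np.zeros(n)                   # a 3-sparse signal
w[[2, 11, 25]] = [1.5, -2.0, 0.7]

y = Phi @ w                       # compressed measurement, length m < n

# The minimum-norm solution of y = Phi w is generally NOT the sparse w:
w_ls = np.linalg.pinv(Phi) @ y
print(np.allclose(Phi @ w_ls, y))             # True: it satisfies the equations
print(np.count_nonzero(np.abs(w_ls) > 1e-8))  # dense: far more than 3 nonzeros
```

The pseudoinverse picks the solution of minimum Euclidean norm among the infinitely many candidates, which spreads energy over all coordinates; recovering the sparse w needs the extra prior discussed next.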
3.2 Sparsity Rendering Algorithms

Here are two concerns:
1: How sparse should w be so that it can be accurately recovered?
2: Is there any requirement on Φ?

For question 1, we know that y = Φw has infinitely many solutions; thus, we must attach some condition to w to make the solution unique. As w is sparse, we seek the sparsest solution of y = Φw.

For question 2, we have the following lemma: Suppose an m × n matrix Φ is such that every set of 2S of its columns is linearly independent. Then an S-sparse vector w (a vector with S nonzero elements) can be reconstructed uniquely from y = Φw.
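The lemma can be checked with a small brute-force sketch (an illustration, not an algorithm from the slides; the sizes m = 8, n = 12, S = 2 are arbitrary). A random Gaussian Φ generically has every set of 2S columns independent, so exactly one support of size S can reproduce y, and searching all supports recovers w:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
m, n, S = 8, 12, 2
Phi = rng.standard_normal((m, n))   # generically, any 2S = 4 columns are independent

w = np.zeros(n)
w[[3, 9]] = [2.0, -1.0]             # S-sparse ground truth
y = Phi @ w

# Exhaustive search over all supports of size S: solve the restricted
# least-squares problem and keep the support that reproduces y exactly.
w_hat = np.zeros(n)
for supp in combinations(range(n), S):
    sub = Phi[:, list(supp)]
    coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
    if np.allclose(sub @ coef, y):
        w_hat[list(supp)] = coef
        break

print(np.allclose(w_hat, w))        # True: unique recovery
```

The search visits C(n, S) supports, so this is only feasible for tiny problems; it is the uniqueness argument made executable, not a practical recovery method (practical algorithms such as l1 minimization come later).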