Word vectors represent word meaning - visualization
[Figure: 2-D visualization of word vectors in which related words cluster together, e.g. need, help, come, go, take, keep, give, get, make, meet, continue, see, expect, want, become, think, remain, say and are, is, be, were, was, being, been, had, has, have; the vector shown for "expect" is (0.286, 0.792, −0.177, −0.107, 0.109, −0.542, 0.349, 0.271, 0.487).]
Word vectors represent word meaning - visualization
• Distributional vectors: each word is represented by a D-dimensional vector, where D << V and V is the size of the vocabulary.
• The greatest contribution of distributed representation is that it makes related or similar words closer in distance and alleviates the curse of dimensionality to a certain extent.
[Figure: example words such as food, eat, laptop placed in the vector space.]
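As a rough illustration of "related or similar words are closer in distance", the sketch below compares dense D-dimensional vectors with cosine similarity. The words food, eat, laptop come from the slide's figure, but the vector values and dimensionality here are invented for demonstration, not taken from the lecture.

    import numpy as np

    def cosine_similarity(u, v):
        """Cosine of the angle between two word vectors."""
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Hypothetical D-dimensional dense vectors (D << V); values are illustrative only.
    vec = {
        "food":   np.array([0.71, 0.12, -0.30, 0.05]),
        "eat":    np.array([0.68, 0.20, -0.25, 0.01]),
        "laptop": np.array([-0.10, 0.80, 0.40, -0.55]),
    }

    print(cosine_similarity(vec["food"], vec["eat"]))     # relatively high: related words
    print(cosine_similarity(vec["food"], vec["laptop"]))  # lower: unrelated words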
Word vectors represent word meaning - visualization
• The famous analogy: King − Man + Woman = Queen (along the Male-Female direction).
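A minimal sketch of the King − Man + Woman = Queen analogy: the offset from Man to Woman, added to King, should land nearest to Queen. The tiny hand-picked 3-D vectors below are placeholders; real embeddings would be learned and much higher-dimensional.

    import numpy as np

    # Toy 3-D vectors chosen by hand so that the gender offset is consistent.
    vec = {
        "king":  np.array([0.9, 0.8, 0.1]),
        "man":   np.array([0.5, 0.8, 0.1]),
        "woman": np.array([0.5, 0.1, 0.8]),
        "queen": np.array([0.9, 0.1, 0.8]),
        "apple": np.array([0.1, 0.5, 0.2]),
    }

    def cosine(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    query = vec["king"] - vec["man"] + vec["woman"]

    # Nearest word to the query vector (excluding the input words) should be "queen".
    best = max((w for w in vec if w not in {"king", "man", "woman"}),
               key=lambda w: cosine(query, vec[w]))
    print(best)  # queen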
Distributed and Distributional Representation
Notes:
• Distributed representation refers to the form of the text representation: a low-dimensional, dense, continuous vector.
• Distributional representation is a method for obtaining text representations: it uses a co-occurrence matrix to derive the semantic representation of words, and each row of the co-occurrence matrix can be regarded as the vector representation of the corresponding word.
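A small sketch of the distributional approach described above: build a word-word co-occurrence matrix from a corpus and treat each row as that word's vector. The toy corpus and the window size of 1 are invented here purely for illustration.

    import numpy as np

    # Toy corpus and a symmetric context window of size 1 (both illustrative).
    corpus = [
        ["i", "like", "nlp"],
        ["i", "like", "deep", "learning"],
        ["nlp", "needs", "deep", "learning"],
    ]
    window = 1

    vocab = sorted({w for sent in corpus for w in sent})
    index = {w: i for i, w in enumerate(vocab)}

    # Each row of the co-occurrence matrix is the distributional vector of one word.
    cooc = np.zeros((len(vocab), len(vocab)), dtype=int)
    for sent in corpus:
        for i, w in enumerate(sent):
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if j != i:
                    cooc[index[w], index[sent[j]]] += 1

    print(vocab)
    print(cooc[index["like"]])  # the row for "like" is its vector representation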
Word2vec
• Tomáš Mikolov, et al. 2013
1. Mikolov T, Chen K, Corrado G, et al. Efficient estimation of word representations in vector space[J]. arXiv preprint arXiv:1301.3781, 2013.
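For reference, a minimal sketch of training word vectors with the gensim implementation of word2vec (not the original tool from Mikolov et al.); the toy corpus and hyperparameters below are placeholders, not values from the lecture.

    from gensim.models import Word2Vec

    # Toy corpus; in practice word2vec is trained on large amounts of raw text.
    sentences = [
        ["word", "vectors", "represent", "word", "meaning"],
        ["distributed", "representation", "is", "low", "dimensional", "and", "dense"],
        ["word2vec", "learns", "word", "vectors", "from", "raw", "text"],
    ]

    # sg=1 selects skip-gram; sg=0 would select CBOW (both introduced in the 2013 paper).
    model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

    print(model.wv["word"].shape)         # (50,) dense vector for "word"
    print(model.wv.most_similar("word"))  # nearest neighbours by cosine similarity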