Xi'an Jiaotong University
Natural Language Processing with Deep Learning
Language Model & Distributed Representation (2)
Chen Li — cli@xjtu.edu.cn — 2023
Outline
1. Word2Vector (W2V)
2. Training LM
3. Evaluating LM
Word Embedding
• Construct a dense vector to represent each word, making this vector similar to the vectors of words that appear in similar contexts.

banking = [0.286, 0.792, −0.177, −0.107, 0.109, −0.542, 0.349, 0.271]
• Word vectors are sometimes called word embeddings or word representations.
• These are distributed representations.
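The idea above — words in similar contexts should get similar dense vectors — can be sketched with a toy lookup table and cosine similarity. The "banking" values come from the slide; the other two vectors and the word choices are made-up illustrations, not trained embeddings.

```python
import math

# Toy embedding table. "banking" is the 8-dimensional vector from the slide;
# "finance" and "banana" are invented for illustration only.
embeddings = {
    "banking": [0.286, 0.792, -0.177, -0.107, 0.109, -0.542, 0.349, 0.271],
    "finance": [0.300, 0.750, -0.150, -0.120, 0.100, -0.500, 0.330, 0.260],
    "banana":  [-0.400, 0.050, 0.600, 0.200, -0.300, 0.100, -0.250, 0.450],
}

def cosine(u, v):
    """Cosine similarity: words used in similar contexts should score higher."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# A word related in context ("finance") scores higher than an unrelated one.
print(cosine(embeddings["banking"], embeddings["finance"]))
print(cosine(embeddings["banking"], embeddings["banana"]))
```

In a real system these vectors would be learned from data (e.g. by Word2Vec, covered next), but the similarity computation over the dense vectors is exactly the same.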