L16 - Word Embedding Flashcards

1
Q

What issue does Word Embedding solve?

A

Topic Modelling's inability to work effectively on shorter texts.

2
Q

Give a high-level explanation of the concept of Word Embedding…

A

Converts words into numeric vectors. Similar words are then represented by similar (or the same) vectors within the Word Embedding.

3
Q

Give the step-by-step process of how it works…

A
  1. Apply a one-hot vector transformation to all words in the corpus.
  2. Input the vectors into a shallow two-layer neural network. The input and output vectors always represent one word each, giving the network a size of 2V + N (V is the length of the input/output vectors, N is the size of the hidden layer).
  3. Input vectors are fed through the activation function along with their associated weights.
  4. Each input word now corresponds to a value in the Word Embedding (the middle, hidden-layer vector). Similar words are given similar or the same values, so the next word can be predicted (see the sketch below).
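
A minimal sketch of this process, assuming a tiny toy corpus and plain NumPy rather than any particular word-embedding library. The corpus, dimensions and learning rate are illustrative assumptions, not values from the flashcards.

```python
import numpy as np

corpus = "the queen rules the kingdom and the king rules the kingdom".split()
vocab = sorted(set(corpus))
V = len(vocab)                      # vocabulary size (length of input/output vectors)
N = 5                               # hidden-layer size (embedding dimension)
word_to_idx = {w: i for i, w in enumerate(vocab)}

def one_hot(idx):
    """Step 1: one-hot vector for a single word."""
    v = np.zeros(V)
    v[idx] = 1.0
    return v

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, N))    # input -> hidden weights (the embedding)
W_out = rng.normal(scale=0.1, size=(N, V))   # hidden -> output weights

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Steps 2-4: feed each word forward and nudge the weights so the network
# predicts the next word in the corpus.
lr = 0.05
for _ in range(200):
    for cur, nxt in zip(corpus[:-1], corpus[1:]):
        x = one_hot(word_to_idx[cur])
        h = x @ W_in                 # hidden layer = the current word's embedding row
        y = softmax(h @ W_out)       # predicted distribution over next words
        t = one_hot(word_to_idx[nxt])
        err = y - t                  # gradient of cross-entropy wrt output scores
        grad_h = err @ W_out.T       # back-propagate through the output weights
        W_out -= lr * np.outer(h, err)
        W_in -= lr * np.outer(x, grad_h)

# Each row of W_in is now a word's embedding; words used in similar contexts
# end up with similar rows.
print(W_in[word_to_idx["queen"]])
```
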
4
Q

What are 2 adaptations to improve the Word Embedding model?

A

Continuous Bag of Words (CBOW) Model - As opposed to a single vector for each input, a set of vectors (a window of context words) is used to establish context for a more accurate output.

Skip-gram model - Enables the prediction of more than one word: from a single input word, the model predicts several surrounding context words (see the sketch below).
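
A hedged illustration of the two adaptations using the gensim library (gensim is an assumption here; the flashcards do not name a library). The `sg` flag switches between CBOW and skip-gram; the sentences and hyperparameters are toy values.

```python
from gensim.models import Word2Vec

sentences = [
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "band", "queen", "plays", "music"],
]

# CBOW: predict the centre word from a window of surrounding context words.
cbow = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0)

# Skip-gram: predict several surrounding context words from one input word.
skipgram = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

print(skipgram.wv.most_similar("queen", topn=3))
```
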

5
Q

What is the goal of Word Embedding?

A

We want to establish a low-dimensional Word Embedding (the weight vectors) which enables the prediction of the next word.

6
Q

What are the 2 main properties of Word Embeddings?

A
  • Numerical Vectors - Represent words as numerical vectors.
  • Vector Space Relationships - In the mathematical vector space, the closeness of vectors represents the similarity of the words they represent (illustrated below).
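
A small sketch of the vector-space-relationship property: cosine similarity between vectors stands in for word similarity. The three vectors below are made-up toy values, not trained embeddings.

```python
import numpy as np

embedding = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.85, 0.75, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine(a, b):
    """Cosine similarity: 1.0 means identical direction, 0 means unrelated."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(embedding["king"], embedding["queen"]))  # close to 1: similar words
print(cosine(embedding["king"], embedding["apple"]))  # much smaller: dissimilar words
```
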
7
Q

What is the main issue of Word Embedding? What is the solution to this?

A

Polysemy - Some words have multiple meanings depending on the context, so a single embedding cannot capture them all. For example, Queen could refer to royalty, music or cards. If the text contains all of these themes, no clear relationship can be established between the word and the document.

Solution : Sense Embedding

8
Q

How does Sense Embedding solve the Polysemy issue of Word Embedding?

A

Construct an embedding for each sense in which the word can be used.

Feed the model many labelled examples in which the word is used.
The model can then derive more detailed context about when each sense is used and retrieve its associated data via the appropriate embedding.
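
A minimal sketch of the sense-embedding idea, assuming labelled examples of the form (sense, context words) and some pre-existing word vectors. All vectors and labels here are toy values for illustration only.

```python
import numpy as np

word_vec = {
    "crown":  np.array([0.9, 0.1, 0.0]),
    "throne": np.array([0.8, 0.2, 0.1]),
    "guitar": np.array([0.0, 0.9, 0.1]),
    "stage":  np.array([0.1, 0.8, 0.2]),
    "ace":    np.array([0.1, 0.1, 0.9]),
    "deck":   np.array([0.2, 0.0, 0.8]),
}

# Labelled examples: each sense of "queen" plus context words it appears with.
labelled = [
    ("royalty", ["crown", "throne"]),
    ("music",   ["guitar", "stage"]),
    ("cards",   ["ace", "deck"]),
]

# One embedding per sense: here simply the average of the context-word vectors
# seen with that sense (a stand-in for what a trained model would learn).
sense_embedding = {
    sense: np.mean([word_vec[w] for w in context], axis=0)
    for sense, context in labelled
}

def pick_sense(context_words):
    """Pick the sense whose embedding is closest to the current context."""
    ctx = np.mean([word_vec[w] for w in context_words], axis=0)
    return max(sense_embedding,
               key=lambda s: ctx @ sense_embedding[s]
               / (np.linalg.norm(ctx) * np.linalg.norm(sense_embedding[s])))

print(pick_sense(["throne"]))  # -> "royalty"
```
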

9
Q

What does sentence / document embedding mean?

A

The closer the sentence or document vectors are to one another, the more similar the sentences or documents they represent.
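
A hedged sketch of one simple way to obtain a sentence embedding: averaging the vectors of the words in the sentence (the flashcard does not prescribe a method). The word vectors are toy values.

```python
import numpy as np

word_vec = {
    "the":    np.array([0.1, 0.1, 0.1]),
    "queen":  np.array([0.9, 0.8, 0.1]),
    "king":   np.array([0.8, 0.9, 0.1]),
    "rules":  np.array([0.7, 0.7, 0.2]),
    "eats":   np.array([0.2, 0.1, 0.8]),
    "apples": np.array([0.1, 0.2, 0.9]),
}

def sentence_embedding(sentence):
    """Average the word vectors to get a single sentence vector."""
    return np.mean([word_vec[w] for w in sentence.split()], axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

s1 = sentence_embedding("the queen rules")
s2 = sentence_embedding("the king rules")
s3 = sentence_embedding("the king eats apples")

# Closer vectors -> more similar sentences.
print(cosine(s1, s2))  # high: similar sentences
print(cosine(s1, s3))  # lower: less similar sentences
```
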

10
Q

What are some applications of Word Embedding?

A

Automatic translation
Text summarisation
Clustering similar texts
