WordNet Flashcards

(40 cards)

1
Q

Polysemous

A

having multiple meanings

2
Q

Word sense

A

A discrete representation of one aspect of the meaning of a word.

3
Q

thesaurus

A

A database that represents word senses.

4
Q

Antonymy

A

When two words/lemmas have opposite meanings.

5
Q

WSD (abbreviation)

A

word sense disambiguation

6
Q

word sense disambiguation

A

The task of determining which sense of a word is being used in a particular context.

7
Q

How does WordNet represent different senses of the same word?

A

With superscripts: mouse¹, mouse², etc.
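
As an illustration, NLTK's WordNet interface exposes these numbered senses directly (this assumes the nltk package and its wordnet corpus are installed):

```python
from nltk.corpus import wordnet as wn

# Each synset name encodes lemma, part of speech, and sense number,
# e.g. 'mouse.n.01' for the first noun sense of 'mouse'.
for synset in wn.synsets('mouse', pos=wn.NOUN):
    print(synset.name(), '-', synset.definition())  # sense id plus its gloss
```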

8
Q

embedding

A

a point in semantic space

9
Q

glosses

A

textual definitions for each sense, in dictionaries or thesauruses.

10
Q

Why can a gloss still be useful, even though it is often circular and relies on real-world knowledge that we as humans have?

A

A gloss is still just a sentence, so we can compute sentence embeddings for it that capture something about the meaning of the sense.
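
As a minimal sketch, glosses can be embedded with any off-the-shelf sentence encoder; here the sentence-transformers library is assumed, and the model name and glosses are only illustrative:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer('all-MiniLM-L6-v2')  # model choice is an assumption

glosses = [
    'any of numerous small rodents with pointed snouts and long thin tails',  # illustrative gloss
    'a hand-operated device that controls a cursor on a screen',              # illustrative gloss
]
gloss_embeddings = model.encode(glosses)  # one vector per sense's gloss
print(gloss_embeddings.shape)
```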

11
Q

Zeugma

A

A conjunction of different readings of a word, for example ‘Air France serves breakfast and the USA’.

12
Q

Synonymy

A

When two senses of two different words/lemmas are identical.

13
Q

Reversives

A

A group of antonyms that describe change or movement in opposite directions.

14
Q

Hyponymy

A

When a word/sense is more specific, denoting a subclass of another word/sense.

15
Q

Hypernymy

A

When a word/sense describes a less specific or superclass of another word/sense.

16
Q

Superordinate ==

A

Hypernym

17
Q

Other word for hypernym

A

Superordinate

18
Q

What does ‘A IS-A B’ mean?

A

A is an instance or subclass of B; in other words, B subsumes A.

19
Q

Meronymy

A

A part-whole relation.

20
Q

Holonymy

A

A whole-part relation. Example: car is a holonym of wheel.

21
Q

Structured polysemy

A

When the senses of a word are related semantically.

22
Q

Metonymy

A

A type of polysemy relation in which one aspect of a concept or entity is used to refer to other aspects of the entity, or to the entity itself.

23
Q

Synset

A

The set of near-synonyms for a WordNet sense.

24
Q

Supersenses

A

The lexicographic categories in which senses are grouped in WordNet.

25
Q

Lexical sample tasks

A

Situations where we just need to disambiguate a small number of words.

26
Q

All-words task

A

A problem in which we have to disambiguate all the words in some text.

27
Q

Semantic concordance

A

A corpus in which each open-class word in each sentence is labeled with its word sense from a specific dictionary or thesaurus.

28
Q

Most frequent sense baseline

A

Choose the most frequent sense for each word from the senses in a labeled corpus. Can be quite accurate.

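A minimal sketch of this baseline over WordNet: NLTK lists a word's synsets roughly in order of corpus frequency, so the first synset serves as the most frequent sense.

```python
from nltk.corpus import wordnet as wn

def most_frequent_sense(word, pos=None):
    # WordNet orders senses by their frequency in sense-tagged corpora,
    # so the first listed synset approximates the most frequent sense.
    synsets = wn.synsets(word, pos=pos)
    return synsets[0] if synsets else None

print(most_frequent_sense('bank'))
```
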
29
Q

One sense per discourse baseline

A

The heuristic that a word tends to keep the same sense throughout a discourse. Not generally used as a baseline, but it holds better for coarse-grained senses.

30
Q

How does the 1-nearest-neighbor WSD algorithm produce contextual sense embeddings (v_s) for sense _s_?

A

It averages the _n_ contextual representations v_i of the _n_ tokens labeled with sense _s_.

31
Q

How does the 1-nearest-neighbor WSD algorithm produce its output?

A

It chooses the sense whose embedding has the highest cosine similarity with the target word in context.

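Taken together, cards 30 and 31 describe the full algorithm; below is a minimal numpy sketch, assuming the contextual vectors v_i come from some encoder such as BERT:

```python
import numpy as np

def sense_embeddings(examples_by_sense):
    # examples_by_sense: sense id -> list of contextual vectors v_i for
    # the labeled tokens of that sense. v_s is their average.
    return {s: np.mean(vs, axis=0) for s, vs in examples_by_sense.items()}

def cosine(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def disambiguate(target_vec, v_s):
    # 1-nearest-neighbor: pick the sense whose embedding is most
    # cosine-similar to the target word's contextual embedding.
    return max(v_s, key=lambda s: cosine(target_vec, v_s[s]))
```
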
32
Q

What class of WSD algorithms does not require labeled data?

A

knowledge-based algorithms

33
Q

Lesk algorithm

A

A WSD algorithm that chooses the sense whose dictionary gloss/definition shares the most words with the target word's neighborhood.

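A minimal sketch of the simplified Lesk variant over WordNet glosses (NLTK also ships its own implementation in nltk.wsd.lesk):

```python
from nltk.corpus import wordnet as wn

def simplified_lesk(word, context_words):
    # Pick the sense whose gloss shares the most words with the
    # target word's neighborhood (a bag of context tokens).
    context = {w.lower() for w in context_words}
    best, best_overlap = None, -1
    for synset in wn.synsets(word):
        gloss = set(synset.definition().lower().split())
        overlap = len(gloss & context)
        if overlap > best_overlap:
            best, best_overlap = synset, overlap
    return best

print(simplified_lesk('bank', 'I deposited money in the bank'.split()))
```
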
34
Q

WiC (abbreviation)

A

word-in-context

35
Q

word-in-context task

A

Given two sentences with a target word in different contexts, the system must decide whether the words are used in the same sense or in different senses.

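One common baseline for this task (the encoder and threshold below are assumptions, not part of the task definition): embed the target word in both contexts and predict "same sense" when the cosine similarity clears a tuned threshold.

```python
import numpy as np

def same_sense(vec1, vec2, threshold=0.6):
    # vec1, vec2: contextual embeddings of the target word in the two
    # sentences; the threshold would be tuned on development data.
    cos = np.dot(vec1, vec2) / (np.linalg.norm(vec1) * np.linalg.norm(vec2))
    return cos >= threshold
```
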
36
Q

Retrofitting / counterfitting

A

Methods that learn a second mapping after embeddings have been trained, shifting the embeddings such that synonyms are closer to each other and antonyms are further apart.

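A minimal sketch of a retrofitting-style update (after Faruqui et al., 2015, with the neighbor weights simplified): each vector is repeatedly pulled toward the average of its lexicon neighbors while staying close to its original position.

```python
import numpy as np

def retrofit(embeddings, lexicon, iters=10, alpha=1.0, beta=1.0):
    # embeddings: word -> original vector; lexicon: word -> list of
    # synonyms. alpha keeps a vector near its original position,
    # beta pulls it toward its lexicon neighbors.
    new = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iters):
        for w, neighbors in lexicon.items():
            neighbors = [n for n in neighbors if n in new]
            if w not in new or not neighbors:
                continue
            pull = beta * sum(new[n] for n in neighbors)
            new[w] = (alpha * embeddings[w] + pull) / (alpha + beta * len(neighbors))
    return new
```
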
37
Q

WSI (abbreviation)

A

word sense induction

38
Q

Word sense induction

A

An unsupervised approach to WSD.

39
Q

Agglomerative clustering

A

Type of clustering where each of the _N_ training instances is initially assigned to its own cluster. New clusters are formed bottom-up by merging similar clusters.

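A minimal sketch with scipy's hierarchical clustering (the data here are random stand-ins for the contextual embeddings a WSI system would cluster):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.normal(size=(12, 5))  # 12 instances, e.g. context embeddings of one word

# Bottom-up: each instance starts in its own cluster; 'average' linkage
# repeatedly merges the two most similar clusters.
Z = linkage(X, method='average', metric='cosine')
labels = fcluster(Z, t=3, criterion='maxclust')  # cut the tree into 3 clusters
print(labels)
```
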