Dependency Parsing Flashcards
What is a dependency grammar?
It is a grammar formalism in which syntactic structure is described by grammatical relations between words, as defined by linguists.
What type of word order do dependency grammars have?
They can handle languages with free word order
What do the edges and nodes represent in a dependency grammar?
Nodes represent words; edges are labelled with grammatical relations
What do grammatical relations connect?
They connect head and dependent words
What is the type of a grammatical relation called?
Its grammatical function
What type of graph is a dependency tree?
It is a directed graph: arcs point from head words to their dependents, and each node has at most one incoming arc (from its head)
What is a dependency tree made up of?
Vertices = nodes = words or sometimes stems/affixes
Arcs = Grammatical function relationships
How many incoming arcs does the root node have?
None
How many arcs does each vertex have coming into it?
Just one
How many vertices can the root node reach through some path?
The root node has a path to every vertex
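The three tree constraints above (a single root with no incoming arc, exactly one incoming arc for every other vertex, and every vertex reachable from the root) can be checked directly. A minimal sketch, not from the flashcards, with illustrative function and argument names:

```python
def is_valid_dependency_tree(n, arcs, root=0):
    """n = number of vertices (including the root), arcs = [(head, dep), ...]."""
    heads = {}
    for head, dep in arcs:
        if dep in heads:          # a vertex with two incoming arcs
            return False
        heads[dep] = head
    if root in heads:             # the root must have no incoming arc
        return False
    if set(heads) != set(range(n)) - {root}:
        return False              # some non-root vertex has no head
    # check that every vertex is reachable from the root (follow head -> dep)
    children = {}
    for head, dep in arcs:
        children.setdefault(head, []).append(dep)
    seen, stack = {root}, [root]
    while stack:
        for d in children.get(stack.pop(), []):
            if d not in seen:
                seen.add(d)
                stack.append(d)
    return len(seen) == n
```

For example, `is_valid_dependency_tree(4, [(0, 2), (2, 1), (2, 3)])` is a valid tree, while a cycle such as `[(1, 2), (2, 1)]` is rejected.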
What does it mean if an arc is projective?
We say an arc from a head to a dependent is projective if there is a path from the head to every word that lies between the head and the dependent in the sentence.
In the image, is the arc connecting flight and was projective? Why?
The arc is not projective, because even though we can go from ‘flight’ to ‘was’, and from ‘was’ to ‘which’, there is no path to get from ‘flight’ to either ‘this’ or ‘morning’
In the image below, is the arc from ‘was’ to ‘late’ projective?
It is projective, as we can go from ‘was’ to ‘late’, and from ‘late’ we can go to ‘already’ directly
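The projectivity definition above can be tested mechanically: an arc is projective iff every word strictly between the head and the dependent (in sentence order) is reachable from the head. A small sketch with illustrative names, where words are integer positions:

```python
def arc_is_projective(arcs, head, dep):
    """arcs = [(h, d), ...] with word positions as integers.
    The arc head -> dep is projective iff every word strictly between
    them in the sentence is reachable from head."""
    children = {}
    for h, d in arcs:
        children.setdefault(h, []).append(d)
    # collect all words reachable from the head
    reachable, stack = set(), [head]
    while stack:
        for d in children.get(stack.pop(), []):
            if d not in reachable:
                reachable.add(d)
                stack.append(d)
    lo, hi = sorted((head, dep))
    return all(w in reachable for w in range(lo + 1, hi))
```

With arcs `[(0, 1), (1, 3), (3, 2)]` the arc 1 -> 3 is projective (word 2 is a descendant of 1); with `[(0, 2), (2, 4), (0, 1), (0, 3)]` the arc 2 -> 4 is not, because word 3 is not reachable from 2.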
What do older parsing algorithms assume?
They assume that trees are always projective. Trees converted from English phrase-structure treebanks are guaranteed to be projective, but in other languages this is not the case and treebanks can often include non-projective trees
What Treebanks are used that include dependency graphs?
The Penn Treebank and the OntoNotes dataset
How can we move from Parse-Structures (Constituency Grammars) to Dependency Structures?
We can identify head-dependent structures using head rules, and then we can connect children to heads using a dependency relation
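The two-step conversion above (find each phrase's head via head rules, then attach the other children's heads to it) can be sketched recursively. The rule table and tree encoding here are illustrative, not a real treebank's head-rule set:

```python
# For each phrase label, child labels in head-priority order (toy rules).
HEAD_RULES = {
    "S": ["VP"],
    "VP": ["VBD"],
    "NP": ["NN"],
}

def to_dependencies(tree, arcs):
    """tree = (label, [children]) for phrases, (tag, word) for leaves.
    Returns the head word of the subtree; appends (head, dependent) arcs."""
    label, body = tree
    if isinstance(body, str):            # leaf: (tag, word)
        return body
    child_heads = [to_dependencies(c, arcs) for c in body]
    # pick the head child via the rule table (default: first child)
    head_idx = 0
    for wanted in HEAD_RULES.get(label, []):
        for i, c in enumerate(body):
            if c[0] == wanted:
                head_idx = i
                break
        else:
            continue
        break
    head = child_heads[head_idx]
    for i, h in enumerate(child_heads):  # attach the other children's heads
        if i != head_idx:
            arcs.append((head, h))
    return head
```

For "the flight left", encoded as `("S", [("NP", [("DT", "the"), ("NN", "flight")]), ("VP", [("VBD", "left")])])`, this yields the arcs flight -> the and left -> flight, with "left" as the overall head.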
What are some problems that arise from translating parse-structures to dependency structures?
We cannot represent non-projective structures (as there are no examples in phrase-structured treebanks)
There is a lack of structure in flat noun phrases
When would you prefer a dependency graph over a constituent based structure?
For training relation extraction models (clause argument detection), because dependency relations directly identify the subject and object of a clause
What is transition based dependency parsing?
These build on classic shift-reduce parsing. The basic idea: we have a CFG, a stack, and an input buffer holding the words. We shift words from the buffer onto the stack and match the symbols on top of the stack against the right-hand sides of the grammar rules, replacing the symbols that match with the rule's left-hand side.
E.g. if we have a determiner and a nominal on top of the stack, and the grammar has the rule NP → Det Nom, we replace them with an NP
What is the Arc Standard approach?
It is a simple and effective transition based dependency parsing approach
What is an Oracle?
It is a model that, given a configuration state, provides the correct transition operator (add left arc, add right arc, do a shift, or finished) when doing the Arc Standard approach to dependency parsing
What happens when we perform a left arc?
We add a relation with the 1st word of the stack as head and the 2nd word as dependent, and remove the dependent from the stack
What happens when we perform a right arc?
We add a relation with the 2nd word of the stack as head and the 1st word as dependent, and remove the dependent from the stack
What happens when we perform a shift?
We move the next word in the input buffer onto the stack
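The three operators above can be sketched in a minimal arc-standard parser. The oracle here is just a precomputed transition sequence rather than a learned model, and the names are illustrative:

```python
SHIFT, LEFTARC, RIGHTARC = "SHIFT", "LEFTARC", "RIGHTARC"

def arc_standard(words, transitions):
    """Apply a given transition sequence; returns (head, dependent) arcs."""
    stack, buffer, arcs = ["ROOT"], list(words), []
    for t in transitions:
        if t == SHIFT:                    # move next buffer word onto the stack
            stack.append(buffer.pop(0))
        elif t == LEFTARC:                # top of stack is head of the 2nd word
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif t == RIGHTARC:               # 2nd word on stack is head of the top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs
```

For "book the flight" with the sequence SHIFT, SHIFT, SHIFT, LEFTARC, RIGHTARC, RIGHTARC, the parser produces the arcs flight -> the, book -> flight, and ROOT -> book.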