Friday, 22 June 2018


Dependency grammar (DG) is a class of modern grammatical theories that are all based on the dependency relation (as opposed to the constituency relation) and that can be traced back primarily to the work of Lucien Tesnière. Dependency is the notion that linguistic units, e.g. words, are connected to each other by directed links. The (finite) verb is taken to be the structural center of clause structure. All other syntactic units (words) are directly or indirectly connected to the verb by these directed links, which are called dependencies. DGs differ from phrase structure grammars (constituency grammars) in that they lack phrasal nodes, although they can acknowledge phrases. Structure is determined by the relation between a word (a head) and its dependents. Dependency structures are flatter than constituency structures in part because they lack a finite verb phrase constituent, and they are thus well suited to the analysis of languages with free word order, such as Czech, Slovak, Turkish, and Warlpiri.
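To make the idea of directed head-dependent links concrete, the following minimal Python sketch (the sentence and its analysis are purely illustrative and not drawn from any particular DG framework) encodes each word together with the position of its head, with the finite verb serving as the root:

```python
# A toy dependency analysis of "Sam likes fresh bones".
# Each position holds the index of that word's head (None marks the root).
tokens = ["Sam", "likes", "fresh", "bones"]
heads = [1, None, 3, 1]   # Sam -> likes, likes = root, fresh -> bones, bones -> likes

def dependents_of(index):
    """Return the indices of all words whose head is the given word."""
    return [i for i, h in enumerate(heads) if h == index]

root = heads.index(None)
print("root:", tokens[root])                     # likes
for i, word in enumerate(tokens):
    deps = [tokens[d] for d in dependents_of(i)]
    print(f"{word}: dependents = {deps}")
```

Note that the structure contains exactly one node per word and no phrasal nodes, which is the point taken up in the comparison with constituency below.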





History

The notion of dependencies between grammatical units has existed since the earliest recorded grammars, e.g. Pāṇini, and the dependency concept therefore arguably predates the constituency concept by many centuries. Ibn Maḍāʾ, a twelfth-century linguist from Córdoba, Andalusia, may have been the first grammarian to use the term dependency in the grammatical sense that we use it today. In early modern times, the dependency concept seems to have coexisted side by side with the constituency concept, the latter having entered Latin, French, English and other grammars from the widespread study of term logic of antiquity. Dependency is also concretely present in the works of Sámuel Brassai (1800-1897), a Hungarian linguist, and Heimann Hariton Tiktin (1850-1936), a Romanian linguist.

Modern dependency grammars, however, begin primarily with the work of Lucien Tesnière. Tesnière was a Frenchman, a polyglot, and a professor of linguistics at the universities in Strasbourg and Montpellier. His major work was published posthumously in 1959 - he died in 1954. The basic approach to syntax he developed seems to have been taken up independently by others in the 1960s, and a number of other dependency-based grammars have gained prominence since those early works. DG has generated a great deal of interest in Germany in both theoretical syntax and language pedagogy. In recent years, the major developments surrounding dependency-based theories have come from computational linguistics, due in part to the influential work that David Hays did in machine translation at the RAND Corporation in the 1950s and 1960s. Dependency-based systems are increasingly being used to parse natural language and to generate treebanks. Interest in dependency grammar is growing at present, international conferences on dependency linguistics being a relatively recent development (Depling 2011, Depling 2013, Depling 2015).




Dependency vs. constituency

Dependency is a one-to-one correspondence: for every element (e.g. word or morph) in the sentence, there is exactly one node in the structure of that sentence that corresponds to that element. The result of this one-to-one correspondence is that dependency grammars are word (or morph) grammars. All that exists are the elements and the dependencies that connect the elements into a structure. This situation should be compared with the constituency relation of phrase structure grammars. Constituency is a one-to-one-or-more correspondence, which means that, for every element in a sentence, there are one or more nodes in the structure that correspond to that element. The result of this difference is that dependency structures are minimal compared to their constituency counterparts, since they tend to contain far fewer nodes.

These trees illustrate two possible ways of rendering the dependency and constituency relations (see below). This dependency tree is an "ordered" tree, i.e. it reflects actual word order. Many dependency trees abstract away from linear order and focus just on hierarchical order, which means they do not show actual word order. This constituency tree follows the conventions of bare phrase structure (BPS), whereby the words themselves are employed as the node labels.

The distinction between dependency and constituency grammars derives in large part from the initial division of the clause. The constituency relation derives from an initial binary division, whereby the clause is split into a subject noun phrase (NP) and a predicate verb phrase (VP). This division is certainly present in the basic analysis of the clause that we find in the works of, for instance, Leonard Bloomfield and Noam Chomsky. Tesnière, however, argued vehemently against this binary division, preferring instead to position the verb as the root of all clause structure. Tesnière's stance was that the subject-predicate division stems from term logic and has no place in linguistics. The importance of this distinction is that if one acknowledges the initial subject-predicate division in syntax as real, then one is likely to go down the path of constituency grammar, whereas if one rejects this division, then the only alternative is to position the verb as the root of all structure, which means one has chosen the path of dependency grammar.



Dependency grammars

The following frameworks are dependency-based:

  • Algebraic syntax
  • Operator grammar
  • Link grammar
  • Functional generative description
  • Lexicase
  • Meaning-text theory
  • Word grammar
  • Extensible dependency grammar

Link grammar is based on the dependency relation, but link grammar does not include directionality in the dependencies between words, and thus does not describe head-dependent relationships. Hybrid dependency/constituency grammars use dependencies between words, but also include dependencies between phrasal nodes - see for example the Quranic Arabic Dependency Treebank. The derivation trees of tree-adjoining grammar (TAG) are dependency-based, although the full trees of TAG are constituency-based, so in this regard it is not clear whether TAG should be viewed more as a dependency or a constituency grammar.

There are major differences between the grammars just listed. In this regard, the dependency relation is compatible with other major tenets of theories of grammar. Thus, like constituency grammars, dependency grammars can be mono- or multistratal, representational or derivational, construction- or rule-based.



Representing dependencies

There are various conventions that DGs employ to represent dependencies. The following schemata (in addition to the tree above and the trees below) illustrate some of these conventions:

The representations in (a-d) are trees, whereby the specific conventions employed in each tree vary. Solid lines are dependency edges and the lighter dashed lines are projection lines. The only difference between tree (a) and tree (b) is that tree (a) employs the category classes to label the nodes, whereas tree (b) employs the words themselves as the node labels. Tree (c) is a reduced tree insofar as the string of words below it and the projection lines are deemed unnecessary and are hence omitted. Tree (d) abstracts away from linear order and reflects just hierarchical order. The arrow convention in (e) is an alternative means of indicating dependencies and is favored by Word Grammar. The brackets in (f) are seldom used, but are nevertheless quite capable of reflecting the dependency hierarchy; dependents appear enclosed in more brackets than their heads. And finally, the indentations as in (g) are another convention that is sometimes employed to indicate the hierarchy of words. Dependents are placed underneath their heads and indented. Like tree (d), the indentations in (g) abstract away from linear order.
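As a small illustration of the indentation convention in (g), the following self-contained Python sketch (the sentence and head indices are hypothetical) prints each dependent indented beneath its head, reflecting hierarchical order only:

```python
# Toy analysis using a head-index encoding: each position holds the index
# of that word's head (None marks the root).
tokens = ["the", "old", "dog", "chased", "the", "cat"]
heads  = [2, 2, 3, None, 5, 3]

def print_indented(index, depth=0):
    """Print a word, then recursively print its dependents indented beneath it."""
    print("  " * depth + tokens[index])
    for i, head in enumerate(heads):
        if head == index:
            print_indented(i, depth + 1)

print_indented(heads.index(None))
# chased
#   dog
#     the
#     old
#   cat
#     the
```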

The point of these conventions is that they are just that: conventions. They do not influence the basic commitment to dependency as the relation that groups syntactic units together.



Types of dependencies

The dependency representations above (and further below) show syntactic dependencies. Indeed, most work in dependency grammar focuses on syntactic dependencies. Syntactic dependencies are, however, just one of three or four types of dependencies. Meaning-text theory, for instance, emphasizes the role of semantic and morphological dependencies in addition to syntactic dependencies. A fourth type, prosodic dependencies, can also be acknowledged. Distinguishing between these types of dependencies can be important, in part because if one fails to do so, semantic, morphological, and/or prosodic dependencies are likely to be mistaken for syntactic dependencies. The following four subsections briefly sketch each of these dependency types. During the discussion, the existence of syntactic dependencies is taken for granted and used as a point of orientation for establishing the nature of the other three dependency types.

Semantic dependencies

Semantic dependencies are understood in terms of predicates and their arguments. The arguments of a predicate are semantically dependent on that predicate. Often, semantic dependencies overlap with and point in the same direction as syntactic dependencies. At times, however, semantic dependencies can point in the opposite direction of syntactic dependencies, or they can be entirely independent of syntactic dependencies. The hierarchy of words in the following examples shows standard syntactic dependencies, whereas the arrows indicate semantic dependencies:

The two arguments Sam and Sally in tree (a) are dependent on the predicate likes, whereby these arguments are also syntactically dependent on likes. What this means is that the semantic and syntactic dependencies overlap and point in the same direction (down the tree). Attributive adjectives, however, are predicates that take their head noun as their argument, so big is a predicate in tree (b) that takes bone as its one argument; the semantic dependency points up the tree and therefore runs counter to the syntactic dependency. A similar situation obtains in (c), where the preposition predicate on takes the two arguments picture and wall; one of these semantic dependencies points up the syntactic hierarchy, whereas the other points down it. Finally, the predicate to help in (d) takes the one argument Jim but is not directly connected to Jim in the syntactic hierarchy, which means that this semantic dependency is entirely independent of the syntactic dependencies.
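The contrast between the two dependency types can be stated compactly in code. The following hypothetical Python sketch records the syntactic head of each word alongside the predicate-argument (semantic) relation for the attributive-adjective case in (b), and reports whether the semantic dependency runs with or against the syntactic one:

```python
# Phrase: "big bone" (cf. tree (b) above). The analysis is only illustrative.
# Syntactic heads: the attributive adjective depends on the noun.
syntactic_head = {"big": "bone", "bone": None}

# Semantic relations: the adjective is a predicate taking its head noun as argument.
semantic_arguments = {"big": ["bone"]}

for predicate, arguments in semantic_arguments.items():
    for argument in arguments:
        if syntactic_head.get(predicate) == argument:
            print(f"{predicate} -> {argument}: points up the tree, against the syntactic dependency")
        elif syntactic_head.get(argument) == predicate:
            print(f"{predicate} -> {argument}: points down the tree, with the syntactic dependency")
        else:
            print(f"{predicate} -> {argument}: independent of the syntactic dependencies")
```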

Morphological dependencies

Morphological dependencies obtain between words or parts of words. When a given word or part of a word influences the form of another word, the latter is morphologically dependent on the former. Agreement and concord are therefore manifestations of morphological dependencies. Like semantic dependencies, morphological dependencies can overlap with and point in the same direction as syntactic dependencies, overlap with and point in the opposite direction of syntactic dependencies, or be entirely independent of syntactic dependencies. The arrows are now used to indicate morphological dependencies.

The plural houses in (a) demands the plural form of the demonstrative determiner, hence these appears, not this, which means there is a morphological dependency that points down the hierarchy from houses to these. The situation is reversed in (b), where the singular subject Sam demands the appearance of the agreement suffix -s on the finite verb works, which means there is a morphological dependency pointing up the hierarchy from Sam to works. The type of determiner in the German examples (c) and (d) influences the inflectional suffix that appears on the adjective alt. When the indefinite article ein is used, the strong masculine ending -er appears on the adjective. When the definite article der is used, in contrast, the weak ending -e appears on the adjective. Thus, since the choice of determiner impacts the morphological form of the adjective, there is a morphological dependency pointing from the determiner to the adjective, whereby this morphological dependency is entirely independent of the syntactic dependencies. Consider further the following French sentences:

The masculine subject le chien in (a) demands the masculine form of the predicative adjective blanc, whereas the feminine subject la maison demands the feminine form of this adjective. A morphological dependency that is entirely independent of the syntactic dependencies therefore points again across the syntactic hierarchy.

Morphological dependencies play an important role in typological studies. Languages are classified as mostly head-marking (Sam work-s) or mostly dependent-marking (these houses), whereby most if not all languages contain at least some minor measure of both head and dependent marking.

Prosodic dependencies

Prosodic dependencies are acknowledged in order to accommodate the behavior of clitics. A clitic is a syntactically autonomous element that is prosodically dependent on a host. A clitic is therefore integrated into the prosody of its host, meaning that it forms a single word with its host. Prosodic dependencies exist entirely in the linear dimension (horizontal dimension), whereas standard syntactic dependencies exist in the hierarchical dimension (vertical dimension). Classic examples of clitics in English are reduced auxiliaries (e.g. -ll, -s, -ve) and the possessive marker -s. The prosodic dependencies in the following examples are indicated with hyphens and the lack of vertical projection lines:

The hyphens and the lack of projection lines indicate prosodic dependencies. A hyphen that appears on the left of the clitic indicates that the clitic is prosodically dependent on the word immediately to its left (He'll, There's), whereas a hyphen that appears on the right side of the clitic (not shown here) indicates that the clitic is prosodically dependent on the word that appears immediately to its right. A given clitic is often prosodically dependent on its syntactic dependent (He'll, There's) or on its head (would've). At other times, it can depend prosodically on a word that is neither its head nor its immediate dependent (Florida's).

Syntactic dependencies

Syntactic dependencies are the focus of most work in dependency grammar, as stated above. How the presence and the direction of syntactic dependencies are determined is of course often open to debate. In this regard, it must be acknowledged that the validity of the syntactic dependencies in the trees throughout this article is being taken for granted. These hierarchies are such, however, that many dependency grammars can largely support them, although there will certainly be points of disagreement. The basic question of how syntactic dependencies are discerned has proven difficult to answer definitively. One should acknowledge, however, that the basic task of identifying and discerning the presence and direction of the syntactic dependencies of dependency grammars is no easier or harder than determining the constituent groupings of constituency grammars. A variety of heuristics are employed to this end, basic tests for constituents being useful tools; the syntactic dependencies assumed in the trees in this article group words together in a manner that most closely matches the results of standard permutation, substitution, and ellipsis tests for constituents. Etymological considerations also provide helpful clues about the direction of dependencies. A promising principle upon which to base the existence of syntactic dependencies is distribution: when one is striving to identify the root of a given phrase, the word that is most responsible for determining the distribution of that phrase as a whole is its root.



Linear order and discontinuity

Traditionally, DGs have had a different approach to linear order (word order) than constituency grammars. Dependency-based structures are minimal compared to their constituency-based counterparts, and these minimal structures allow one to focus intently on the two ordering dimensions. Separating the vertical dimension (hierarchical order) from the horizontal dimension (linear order) is easily accomplished. This aspect of dependency-based structures has allowed DGs, starting with Tesnière (1959), to focus on hierarchical order in a manner that is hardly possible for constituency grammars. For Tesnière, linear order was secondary to hierarchical order insofar as hierarchical order preceded linear order in the mind of a speaker. The stemmas (trees) that Tesnière produced reflected this view; they abstracted away from linear order to focus almost entirely on hierarchical order. Many DGs that followed Tesnière adopted this practice, that is, they produce tree structures that reflect hierarchical order alone, e.g.

The traditional focus on hierarchical order generated the impression that DGs have little to say about linear order, and it has contributed to the view that DGs are particularly well suited to examine languages with free word order. A negative result of this focus on hierarchical order, however, is that there is a dearth of dependency-based explorations of particular word order phenomena, such as standard discontinuities. Comprehensive dependency grammar accounts of topicalization, wh-fronting, scrambling, and extraposition are mostly absent from many established dependency-based frameworks. This situation can be contrasted with constituency grammars, which have devoted tremendous effort to exploring these phenomena.

The nature of the dependency relation does not, however, prevent one from focusing on linear order. Dependency-based structures are as capable of exploring word order phenomena as constituency-based structures. The following trees illustrate this point; they represent one way of exploring discontinuities using dependency-based structures. The trees suggest the manner in which common discontinuities can be addressed. An example from German is used to illustrate a scrambling discontinuity:

The a-tree on the left shows a projectivity violation (= crossing lines), and the b-tree on the right shows one means of addressing this violation. The displaced constituent takes on a word as its head that is not its governor. The words in red mark the catena (= chain) of words that extends from the root of the displaced constituent to the governor of that constituent. Discontinuities are then explored in terms of these catenae. The limitations on topicalization, wh-fronting, scrambling, and extraposition can be explored and identified by examining the nature of the catenae involved.
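Crossing lines of the kind shown in the a-tree can be detected mechanically. The following self-contained Python sketch (a generic test over head indices, not specific to any one DG framework) checks whether any two dependency arcs in a sentence intersect, which is the configuration marked by the crossing lines:

```python
def has_crossing_arcs(heads):
    """Return True if any two dependency arcs cross, i.e. the structure is discontinuous.

    `heads` maps each word position to the position of its head (None marks the root).
    """
    arcs = [(min(i, h), max(i, h)) for i, h in enumerate(heads) if h is not None]
    for a_start, a_end in arcs:
        for b_start, b_end in arcs:
            # Two arcs cross if exactly one endpoint of one lies strictly inside the other.
            if a_start < b_start < a_end < b_end:
                return True
    return False

# Hypothetical head indices for two four-word strings.
print(has_crossing_arcs([1, None, 1, 2]))   # False: continuous structure
print(has_crossing_arcs([2, 3, None, 2]))   # True: arcs (0,2) and (1,3) intersect
```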



Syntactic functions

Traditionally, DGs have treated the syntactic functions (= grammatical functions, grammatical relations) as primitive. They posit an inventory of functions (e.g. subject, object, oblique, determiner, attribute, predicative, etc.). These functions can appear as labels on the dependencies in the tree structures, e.g.

The syntactic functions in this tree are shown in green: ATTR (attribute), COMP-P (complement of preposition), COMP-TO (complement of to), DET (determiner), P-ATTR (prepositional attribute), PRED (predicative), SUBJ (subject), TO-COMP (to complement). The functions chosen and the abbreviations used in the tree here are merely representative of the general stance of DGs toward the syntactic functions. The actual inventory of functions and the designations employed vary from DG to DG.
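In computational work this stance is mirrored directly in how analyses are stored: every dependency edge simply carries a function label. For instance, a dependency parser such as spaCy exposes a label on each token's edge to its head. The sketch below assumes spaCy and its small English model are installed; the sentence is arbitrary, and the label inventory the model uses differs from the abbreviations in the tree above:

```python
import spacy

# Assumes the small English model has been downloaded beforehand:
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")
doc = nlp("The old dog chased the cat into the garden.")

# Each token carries a syntactic-function label on the edge to its head.
for token in doc:
    print(f"{token.text:10} --{token.dep_}--> {token.head.text}")
```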

As primitives of the theory, the status of these functions is much different than in some constituency grammars. Traditionally, constituency grammars have derived the syntactic functions from the constellation. For instance, the object is identified as the NP appearing inside the finite VP, and the subject as the NP appearing outside of the finite VP. Since DGs reject the existence of a finite VP constituent, they were never presented with the option to view the syntactic functions in this manner. The issue is a question of what comes first: traditionally, DGs take the syntactic functions to be primitive and then derive the constellation from these functions, whereas constituency grammars traditionally take the constellation to be primitive and then derive the syntactic functions from the constellation.

This question about what comes first (the functions or the constellation) is not an inflexible matter. The stances of both grammar types (dependency and constituency grammars) are not narrowly limited to the traditional views. Dependency and constituency are both fully compatible with both approaches to the syntactic functions. Indeed, monostratal systems, be they dependency- or constituency-based, will likely reject the notion that the functions are derived from the constellation or that the constellation is derived from the functions. They will take both to be primitive, which means that neither can be derived from the other.





External links

  • Universal Dependencies - a collection of treebanks with harmonized dependency grammar annotation
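Universal Dependencies treebanks are distributed in the plain-text CoNLL-U format, in which each token line carries ten tab-separated fields, including a head index and a dependency relation label. A minimal reading sketch in Python might look like the following (the file name is only a placeholder):

```python
# Minimal CoNLL-U reader: extracts (form, head index, relation) triples per sentence.
def read_conllu(path):
    sentences, current = [], []
    with open(path, encoding="utf-8") as handle:
        for line in handle:
            line = line.rstrip("\n")
            if not line:                      # a blank line ends a sentence
                if current:
                    sentences.append(current)
                    current = []
            elif line.startswith("#"):        # comment lines carry sentence metadata
                continue
            else:
                fields = line.split("\t")     # ID, FORM, LEMMA, UPOS, XPOS, FEATS, HEAD, DEPREL, DEPS, MISC
                if fields[0].isdigit():       # skip multiword-token and empty-node lines
                    current.append((fields[1], int(fields[6]), fields[7]))
    if current:
        sentences.append(current)
    return sentences

# "example.conllu" is a placeholder path; real UD treebanks use this same format.
for sentence in read_conllu("example.conllu"):
    for form, head, deprel in sentence:
        print(f"{form}\t{head}\t{deprel}")
```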

Source of the article: Wikipedia
