Chopping down the Syntax Tree

Blog post by Remi van Trijp based on a recent article in Language and Cognition

One of the most notorious problems in linguistics is how to handle “long-distance dependencies”: utterances in which some elements seem to have been taken away from their original position and then moved to a different place. Typical examples are WH-questions such as “What did you see?”, in which the direct object (“what”) takes sentence-initial position instead of following the verb, as it would in a declarative utterance (e.g. “I saw the game”).

But what makes long-distance dependencies so difficult? Most linguists assume a tree structure (or “phrase structure”) for analyzing utterances. As a data structure, a tree consists of nodes that have at most one parent, which means that information can only trickle down from a parent to its immediate children, or percolate upwards in the other direction. A tree structure is thus hopelessly inadequate for representing dependencies between nodes at the top of the hierarchy and nodes situated somewhere below. The most common solution to this problem is to posit a “gap” where we would normally expect a part of the utterance. Information about the gapped element then has to be communicated node by node up the tree, until the “filler” of the gap is found.
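To see why, here is a minimal sketch in Python (invented names, not the article’s formalism; actual filler-gap theories use devices such as “slash” features) of the bookkeeping a tree forces on us: information about a gap can only reach its filler by being copied upward through every intermediate node.

```python
# Toy sketch of filler-gap bookkeeping: every node between the gap
# and the filler must carry the gap information upward.

class Node:
    def __init__(self, label, children=(), gap=None):
        self.label = label             # e.g. "S", "VP", "NP"
        self.children = list(children)
        self.gap = gap                 # category of a missing element, if any

def percolate(node):
    """Collect the unresolved gaps below this node.

    A gap introduced deep in the tree can only reach its filler by
    being copied through every intermediate node -- the node-by-node
    communication described above."""
    if node.gap is not None:
        return {node.gap}
    gaps = set()
    for child in node.children:
        gaps |= percolate(child)
    return gaps

# "What did you see __?" -- the object NP inside the VP is gapped.
vp = Node("VP", [Node("V:see"), Node("NP", gap="NP")])
s = Node("S", [Node("NP:what"), Node("AUX:did"), Node("NP:you"), vp])
print(percolate(s))  # {'NP'}: the gap travelled up two levels to meet "what"
```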

In recent years, however, a cognitive-functional alternative has started to crystallize in which long-distance dependencies emerge spontaneously as a side effect of how grammatical constructions interact with each other to serve the different communicative needs of language users. For example, the difference between “I like ice cream” and “Ice cream I like” can simply be explained as the tendency of speakers to put the most topical information at the front of the sentence – suggesting that word order should be decoupled from an utterance’s hierarchical structure.
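As a rough illustration of that decoupling (a toy sketch with hypothetical names, not code from the article), one and the same functional structure can be linearized in two ways, depending purely on what is topical:

```python
# Toy sketch: one functional structure, two word orders,
# driven by topicality rather than by a phrase-structure tree.

functional = {
    "predicate": "like",
    "args": {"agent": "I", "theme": "ice cream"},
}

def linearize(struct, topic=None):
    """Put the topic first if there is one; otherwise use default order."""
    words = [struct["args"]["agent"], struct["predicate"], struct["args"]["theme"]]
    if topic in words:
        words.remove(topic)
        words.insert(0, topic)
    return " ".join(words)

print(linearize(functional))                     # I like ice cream
print(linearize(functional, topic="ice cream"))  # ice cream I like
```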

While this view has long been dismissed as ad hoc and as not lending itself to proper scientific formalization, there now exists a formally explicit computational implementation of the cognitive-functional alternative in Fluid Construction Grammar, which works for both parsing and production. The implementation eliminates all of the formal machinery needed for filler-gap analyses by chopping down the syntax tree: rather than taking a tree structure as the sole device for representing all of the information in an utterance, different linguistic perspectives are represented on an equal footing (including an utterance’s information structure, functional structure, illocutionary force, and so on).
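The sketch below (illustrative only; Fluid Construction Grammar itself is implemented with feature structures in Common Lisp, and the field names here are invented) hints at what “equal footing” buys us: a construction can read the theme of “see” directly off the functional structure, so there is no gap to posit and nothing to percolate.

```python
# Illustrative only: several perspectives on "What did you see?"
# represented side by side, none of them privileged over the others.

analysis = {
    "form":        ["what", "did", "you", "see"],        # word order
    "functional":  {"predicate": "see",
                    "agent": "you", "theme": "what"},    # who does what
    "information": {"topic": "what"},                    # information structure
    "illocution":  "question",                           # illocutionary force
}

# A construction can consult any perspective directly: the theme of
# "see" is simply "what" -- no gap, no node-by-node communication.
assert analysis["functional"]["theme"] == "what"
```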

The implementation shows that a cognitive-functional approach to long-distance dependencies outperforms the filler-gap analysis in several domains: it is more parsimonious and more complete (i.e. it includes a processing model), and it offers a better fit to empirical data on language evolution.

Access the entire article without charge until 31st July 2014
