Chong Zhang: dissertation defense (May 5)

Chong Zhang defended her dissertation, titled "Stacked relatives: their structure, processing, and computation," on Friday, May 5. Congratulations!
This dissertation investigates the syntactic structure, processing effects, and computation of stacked relative clauses (RCs). Two structurally different languages are studied: English and Mandarin Chinese. The central question is to what extent the difficulty of processing a given sentence (modulo ambiguity) depends on how much working memory is required to compute its structure. Memory load is quantified by complexity metrics defined in a computationally rigorous fashion. The conclusion is that both memory load and memory reactivation need to be considered in building a cognitive model of language processing.
Based on the grammatical function of the relativized noun in the RC, an RC is either a S(ubject)RC or an O(bject)RC. For the processing of single RCs, a robust SRC advantage has been reported for English (King & Just, 1991, a.o.), while studies on Mandarin have found both SRC and ORC advantages (Lin & Bever, 2006; Gibson & Wu, 2013, a.o.). In this dissertation, the experimental study of the first RC in stacked RCs replicates the SRC advantage in English and the ORC advantage in Mandarin. Crucially, these advantages disappear in the second RC. The dominant effect in RC2 processing is parallelism: stacked RCs with the same relativization type (SRC-SRC & ORC-ORC) are processed significantly faster than their counterparts with different relativization types (ORC-SRC & SRC-ORC).
The parallelism effects are very pronounced yet difficult to capture. They are unexpected under current psycholinguistic models such as the Active-Filler strategy (Frazier, 1987) or the DLT (Gibson, 2000). Among the 700,000 metrics tested that are based on notions of memory load, none replicates the observed processing preferences. However, a metric that keeps track of feature reactivation in the parser does account for the parallelism effect. This suggests that a cognitive model of language processing must take into account not only memory load but also memory reactivation. The dissertation thus points out a general direction for building such a model.