David Pereplyotchik, The Psychological Import of Syntactic Theory
My primary goal is to assess whether, and in what sense, the rules and principles of a grammar are psychologically real. I begin by casting doubt on a received view in generative linguistics, according to which a true theory of the syntax of natural language would, ipso facto, be a theory of a psychological state or mechanism. Following Michael Devitt, I argue that a nominalist construal of linguistic theory is a viable alternative to the dominant Chomskyan view that linguistics is a branch of psychology. If this is correct, then it follows that there are substantive issues about whether the theoretical constructs of formal linguistics play any role in psychological processes, and, if so, what role they play. To address these issues, I examine a range of behavioral and neurocognitive data from psycholinguistics. These data strongly suggest that a subpersonal mechanism—the human language processor—constructs mental representations of the syntactic properties of incoming linguistic stimuli. In an effort to characterize the role that the notion of mental representation plays in contemporary psycholinguistics, I go on to survey a number of computational models of human language comprehension. While all such models account for an impressive range of psycholinguistic data, they make use of the rules or principles of a grammar in one of two very different ways—either by explicitly representing them or by embodying them. It is reasonable to suppose, then, that grammars are psychologically real in one of these two ways. But which? To answer this question, I go on to sketch a theoretical framework for thinking about represented and embodied rules, distinguishing embodiment from mere conformity to a rule. I then argue that embodied rules are typically implemented by simpler mechanisms; embodiment is, therefore, the more parsimonious hypothesis (ceteris paribus). 
Furthermore, I show that we have no principled grounds, at present, for asserting that grammars are represented, rather than embodied, in the human brain. From this, I conclude that a common claim in generative linguistics, i.e., that grammars are represented in the minds of competent language users, must be seen either as a conflation of the notions of embodiment and representation, or as simply an ungrounded hypothesis.
§§1-2: Katz holds that some fields, including linguistics, aim to supply a priori knowledge. He argues that Quine’s opposing position is inconsistent, because it entails that the very principles of reasoning are revisable. But there is a distinction between principles of reasoning and our theory of those principles. The latter can be revised without threat of contradiction. The former can change over time, but this is not revision, strictly speaking.
§2.2: Katz, Postal, and others argue that, since natural languages generate nondenumerably many sentences, linguistics is not about any aspect of the natural world, but, rather, about “abstract entities.” But linguists’ claims concerning the infinitude of language do not reflect this dubious ontological commitment. Rather, they embody a methodological stance that warrants an idealization to the effect that natural languages generate (at most) denumerably many sentences. We should treat such claims as affirming the modal import of linguistic generalizations.
§§3-4: Two rival interpretations of linguistic theory remain. Chomsky (§3) holds that linguistics is a branch of psychology; syntactic theories are about the mind. Devitt (§4) holds that linguistics is about the high-level relational properties of concrete inscriptions, acoustic blasts, and the like.
Chapter 2: The Cognitive Conception and Its Discontents
§1: It is sometimes thought that the aim of achieving explanatory adequacy (as against mere observational and descriptive adequacy) only makes sense on Chomsky’s view, not on Devitt’s. But it’s possible to motivate this reasonable aim solely by an appeal to the theoretical virtues of generality and explanatory unification, which Devitt endorses.
§2: Chomsky holds that the notion of an E-language is theoretically idle, and that the identity conditions on I-languages are more secure than those on E-languages. But E-language has a theoretical role to play in the theory of language acquisition. Moreover, individuating I-languages is no easier than individuating E-languages; both projects must idealize away from messy variation.
§1: Language comprehension typically requires computing mental representations of the syntactic structures of linguistic stimuli. (Call these mental phrase markers.)
§2: Neurocognitive studies using the violation paradigm have found an ERP signature that is elicited by, and only by, syntactically ill-formed stimuli. These studies show that the human sentence-processing mechanism (henceforth, HSPM) represents distinctly syntactic properties, and that this representation “hierarchically dominates” the representations of other linguistic properties.
§3: Structural priming experiments provide further evidence for the psychological reality of mental phrase markers, and help us determine their contents.
§4: Studies of garden-path phenomena have shored up principles of ambiguity resolution that form the foundation of most comprehension models. These principles all make ineliminable reference to mental phrase markers. Such studies furthermore support positing a stage in processing where only syntactic information plays a role in determining the HSPM’s attachment preferences.
§5: Cross-modal priming experiments suggest that wh-traces are psychologically real and that, in searching for them, the HSPM uses stimulus cues as well as considerable “knowledge” of grammatical constraints.
Chapter 4: On Two Attempts to Do Without Mental Phrase Markers
§1: Two models that eschew mental phrase markers will be examined and rejected.
§2: In the 1980s, Roger Schank and his colleagues built language processing systems that relied solely on semantic and pragmatic information. Such systems represented no syntactic properties, except the linear order of the words in the input. Marcus (1984) shows that this is insufficient for processing syntactically complex constructions, including embedded clauses, multiple passivization, wh-questions, and anaphora.
§3: Devitt (2006) tentatively endorses a “brute-causal” (BC) model, according to which the HSPM does not construct representations of syntactic properties, but instead maps phonetic representations directly into thoughts. On this view, thoughts are syntactically similar to expressions in the hearer’s public language—a strengthened language of thought hypothesis (S. LOTH). I argue against this view in two stages. First, the standard arguments for LOTH are not wholly compelling. Second, Devitt’s abductive argument for the stronger version (S. LOTH), i.e., that it explains the syntactic properties of public-language expressions, faces three additional difficulties. (i) The argument appeals to a dubious thesis concerning the explanatory priority of thought over language. (ii) Defensible alternatives that do not support S. LOTH can do the same explanatory work. (iii) The assumptions underlying the argument are in tension with other aspects of the BC model.
§3.3: The BC model claims that language comprehension is reflex-like, associative, and does not operate on structured metalinguistic representations. But there are compelling grounds for doubt about these claims. Finally, Devitt’s reply to a forceful objection to the BC model relies on a distinction between representing and responding. But the operations of the HSPM cannot be viewed as mere responses, because they synthesize information about stimuli from discontinuous regions of time. This last line of reasoning sheds light on the nature of mental representation.
§§1-6: A question remains: does the HSPM house mental representations of syntactic rules or principles? And if not, in what sense (if any) are grammars psychologically real? To settle this issue, I look at a wide variety of computational parsing models, starting with the foundational Earley and CYK algorithms (§2), moving through Augmented Transition Networks (§4), Principles-based parsers (§5), and Minimalist parsers (§6). The grammar underlying each model is sketched and motivated. Along the way, I flesh out the idea that parsing is a species of natural deduction, and examine an old debate concerning the Derivational Theory of Complexity (§3).
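To give a concrete flavor of the kind of parsing algorithm surveyed here, below is a minimal sketch of the CYK chart parser. The toy grammar (in Chomsky normal form) and the example sentence are my own illustrations, not taken from the dissertation; the point is only to show how such a parser builds up a record of the syntactic structure of its input span by span.

```python
from itertools import product

# Illustrative toy grammar in Chomsky normal form (my example).
# Each right-hand side (a terminal, or a pair of nonterminals)
# maps to the set of nonterminals that can produce it.
grammar = {
    ("the",): {"Det"},
    ("dog",): {"N"},
    ("barked",): {"VP"},
    ("Det", "N"): {"NP"},
    ("NP", "VP"): {"S"},
}

def cyk_parse(words, grammar):
    """Return a chart mapping (start, length) to the set of
    nonterminals that derive words[start:start+length]."""
    n = len(words)
    chart = {}
    # Length-1 spans: look each word up in the terminal rules.
    for i, w in enumerate(words):
        chart[(i, 1)] = set(grammar.get((w,), set()))
    # Longer spans: combine every pair of adjacent sub-spans.
    for length in range(2, n + 1):
        for start in range(n - length + 1):
            cell = set()
            for split in range(1, length):
                left = chart[(start, split)]
                right = chart[(start + split, length - split)]
                for pair in product(left, right):
                    cell |= grammar.get(pair, set())
            chart[(start, length)] = cell
    return chart

words = "the dog barked".split()
chart = cyk_parse(words, grammar)
print("S" in chart[(0, len(words))])  # True: the whole string parses as S
```

The completed chart is, in effect, a set of (partial) phrase markers for the input, which is why such algorithms are a natural testing ground for questions about whether the grammar they consult is explicitly represented (here, as the `grammar` data structure) or merely embodied in the procedure.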
§7: The upshot of these investigations is that, while grammars play an important role in language processing—and are, to that extent, psychologically real—there are, at present, no decisive reasons for thinking that grammars are declaratively represented in a separate data structure, rather than merely embodied as procedural dispositions. To clarify this claim, I look briefly at several prominent theories of mental representation, and examine a number of examples of embodied rules. The discussion also elucidates several important distinctions in cognitive science: personal/subpersonal, design-/intentional-stance, Lewis-/Putnam-functionalism, implicit/explicit, conscious/nonconscious, procedural/declarative, and dispositional/occurrent. In the end, mental grammars are argued to be a subpersonal analogue of dispositional beliefs. Given that the latter are psychologically real, so are the former.