Meredith Landman on Variables in Natural Language

Ever since Quine’s “On What There Is”, discussions of the types of variables in natural languages have occupied a special place in semantics. According to Quine, “to be assumed as an entity is, purely and simply, to be reckoned as the value of a variable.” After eleven years in the archives, Meredith Landman’s landmark 2006 dissertation on Variables in Natural Language has now been made publicly available on ScholarWorks. Landman’s dissertation argues for severe type restrictions for object language variables in natural languages, targeting pro-forms of various kinds, elided constituents, and traces of movement.

In his 1984 UMass dissertation, Gennaro Chierchia had already proposed the ‘No Functor Anaphora Constraint’, which says that ‘functors’ (e.g. determiners, connectives, prepositions) do not enter anaphoric relationships. Landman’s dissertation goes further in arguing for a constraint that affects all object language variables and also rules out properties as possible values for them. Her ‘No Higher Types Variable Constraint’ (NHTV) restricts object language variables to the semantic type e of individuals.

Landman explores the consequences of the NHTV for the values of overt pro-forms like such or do so, as well as for gaps of A’-movement and for NP and VP ellipsis. Since the NHTV bars higher type variables in all of those cases, languages might have to use strategies like overt pro-forms or partial or total syntactic reconstruction of the antecedent to interpret certain types of movement gaps and elided constituents. The NHTV thus validates previous work arguing for syntactic reconstruction and against the use of higher-type variables (e.g. Romero 1998 and Fox 1999, 2000), as well as work arguing for treating ellipsis as involving deletion of syntactic structure.

The topic of the type of traces has most recently been taken up again in Ethan Poole’s 2017 UMass dissertation, which contributes important new evidence confirming that the type of traces should indeed be restricted to type e.

(This post was crafted in collaboration with Meredith Landman, who also provided the pictures.)

The 2017 David Lewis Lecture

I feel so honored and happy to be giving the 2017 David Lewis Lecture in Princeton. David Lewis was the most important influence on me as I was mapping out the path I wanted to take as a linguist and semanticist. Mysteriously, the handwriting on the poster is Lewis’s very own handwriting.

David Lewis’s General Semantics (Synthese 22, 1970) was the work that turned me into a semanticist. I was introduced to the article in a Konstanz seminar with Yehoshua Bar-Hillel. I still consider General Semantics the most important milestone in the history of formal semantics for natural languages. In that paper, Lewis teaches us how to connect formal semantics to Chomsky’s Aspects model, for example: “I have foremost in mind a sort of simplified Aspects-model grammar (Chomsky, 1965), but I have said nothing to eliminate various alternatives.” Lewis shows how an insightful theory of semantics and pragmatics can be brought together with an explanatory theory of syntax of the kind Chomsky pioneered. General Semantics is, I believe, the first work that presents a compositional theory of meaning that unifies the perspectives of generative syntax with those of formal logic and analytic philosophy. I think David Lewis’s work was a factor in putting an end to the ‘Linguistics Wars’. It made clear that formal semantics (and pragmatics) and syntactic theory in the spirit of Chomsky could travel together peacefully.

Lewis’s Adverbs of Quantification was a major inspiration for Irene Heim’s and my dissertations. It is the source of the idea that indefinites introduce variables that can be unselectively bound by independent sentential operators and contains the seeds of the restrictor view of if-clauses. Current pragmatic theory would not be what it is today without Convention and Scorekeeping in a Language Game: Contemporary game-theoretical pragmatics, theories of presupposition accommodation, the idea of scoreboards keeping track of salient features of discourse, and context-dependent theories of relative modality all have their roots in those two works. What made Lewis’s ideas so powerful was that they were launched in beautiful prose and with minimal technical machinery. This is why they could so easily cross disciplinary borders.

Robert Stalnaker


From MIT News Office

“While working in a famously esoteric field, MIT philosopher Robert Stalnaker has focused his career on thinking about real-world concerns — including the fundamental nature of speech, thought, and decision-making. In so doing, he has catalyzed and provided the underpinnings for new research in many other areas, such as game theory, linguistics, decision theory, and economics. In all these research areas, Stalnaker’s influence has been widespread and profound, but his impact on modern linguistics — a field that was just coming into its own in the 1970s — has been especially significant, providing the first clear understanding of what is going on in conditional sentences that are counterfactual.”

Logic and Grammar


Sandro Botticelli: A Young Man Being Introduced to the Seven Liberal Arts

Why do I make my semantics students learn logic? I ask them to work through both volumes of the Gamut textbook, even though Gamut doesn’t speak the language of linguistics. It is written in the language of logic. Why should semantics students have to learn how to talk and reason in this way? There is a simple answer: In an interdisciplinary field everyone from any participating field has to speak the language of the other fields. That’s your entrance ticket for success in an interdisciplinary enterprise. You have to understand where the practitioners of other fields are coming from. As a relatively new interdisciplinary field, formal semantics has been a success. It is the result of the marriage of two highly formalized and abstract theories: Logic, which provides theories of the human notion of what a valid piece of reasoning is, and Syntax, which contributes theories of how hierarchical syntactic structures are computed in natural languages. The marriage is solid and has been going strong for almost 50 years. Many young linguists, logicians, and philosophers are fluent in three disciplines, and collaborate in joint research institutions, journals, and conferences.

You may have heard people say that theories of logic can’t be cognitive theories because people make logical mistakes. Yes, we all do make logical mistakes. What is important, though, is that, when we do, we can be convinced that we were wrong. How come? There must be a notion of what a valid piece of reasoning is that is the same for all human beings. Imagine what the world would be like if people all had different notions of what follows from what and what is or isn’t consistent. Mathematics would be impossible, science would be impossible, laws and contracts would be impossible, social institutions would be impossible, … For more than 2000 years, logicians have been designing theories of universally shared patterns of valid human reasoning. The resulting theories are among the most sophisticated theories science has produced to date. And they are the most sophisticated formal theories in cognitive science. One of the key insights of the early logicians was the discovery that little words like not, and, or, some, all, must, may, and so on are the main players in patterns of valid reasoning. That is, those patterns are created by properties of the functional (that is, logical) vocabularies of human languages. It’s precisely those vocabularies that also provide the scaffolding for syntactic structures. Syntax is about the hierarchical structures projected from the functional vocabularies of natural languages; Logic provides the models of how to study the meanings of those vocabularies and how to explain their role in reasoning. In formal semantics, those two disciplines have come together.

Contemporary semantics was born when the traditional perspectives of logic merged with the modern enterprise of generative syntax, as initiated by Noam Chomsky. The first worked-out formal semantic system in this tradition was David Lewis’s 1970 paper General Semantics, one of the most beautiful and enjoyable articles in semantics to the present day. Lewis made an explicit connection with Chomsky’s Aspects model, the generative syntax model of the time. In contrast to Lewis, Richard Montague was outspokenly hostile to Chomsky’s work. He was not interested in Chomsky’s call for an explanatory syntax. It was only after Montague’s death that linguists like David Dowty, Lauri Karttunen, Barbara Partee, Stanley Peters, and Robert Wall made Montague’s works accessible to linguistic audiences.

Situations in Natural Language Semantics

Gilles Trehin: Urville. Source: Gizmodo

The Stanford Encyclopedia of Philosophy has recently implemented a redesign of its website. My article on Situations in Natural Language Semantics appears in a new look.

“Situation semantics was developed as an alternative to possible worlds semantics. In situation semantics, linguistic expressions are evaluated with respect to partial, rather than complete, worlds. There is no consensus about what situations are, just as there is no consensus about what possible worlds or events are. According to some, situations are structured entities consisting of relations and individuals standing in those relations. According to others, situations are particulars. In spite of unresolved foundational issues, the partiality provided by situation semantics has led to some genuinely new approaches to a variety of phenomena in natural language semantics. In the way of illustration, this article includes relatively detailed overviews of a few selected areas where situation semantics has been successful: implicit quantifier domain restrictions, donkey pronouns, and exhaustive interpretations. It moreover addresses the question of how Davidsonian event semantics can be embedded in a semantics based on situations. Other areas where a situation semantics perspective has led to progress include attitude ascriptions, questions, tense, aspect, nominalizations, implicit arguments, point of view, counterfactual conditionals, and discourse relations.”

There is a lot of recent work on domain restrictions in situation semantics, in particular on domain restrictions for definite descriptions:

Paul Elbourne’s 2002 MIT dissertation, his 2005 book on Situations and Individuals, and his 2013 book on Definite Descriptions. “My contention in this book is that definite descriptions are best analyzed semantically as expressions that contain a locally free situation variable; when the situation variable is bound or assigned a referent, the definite description ranges over or refers to individuals. So I will be working with a semantics based on situations” (from Definite Descriptions, p. 17). Elbourne also exploits situation variables for a theory of presupposition projection. 

Ezra Keshet’s 2008 MIT dissertation and his 2010 Natural Language Semantics article on Situation Economy: “… a rule of Situation Economy is advanced, which holds that structures must have the fewest number of situation pronouns possible. Strong DPs require a situation pronoun to receive a de re reading, and therefore a restriction on the type of strong determiners is proposed, which supersedes Situation Economy in this case.”

Florian Schwarz’s 2009 UMass Amherst dissertation and his 2012 Natural Language Semantics article on Situation Pronouns in Determiner Phrases: “This paper is primarily concerned with situation pronouns inside of determiner phrases, arguing that they are introduced as arguments of (certain) determiners. Verbal predicates, on the other hand, are assumed to not combine with a situation pronoun. The various restrictions on their interpretation are shown to fall out from the semantic system that is developed based on that view.”

Dictionary of Untranslatables: a Philosophical Lexicon



“This is an encyclopedic dictionary of close to 400 important philosophical, literary, and political terms and concepts that defy easy–or any–translation from one language and culture to another. … The entries, written by more than 150 distinguished scholars, describe the origins and meanings of each term, the history and context of its usage, its translations into other languages, and its use in notable texts. The dictionary also includes essays on the special characteristics of particular languages–English, French, German, Greek, Italian, Portuguese, Russian, and Spanish.”

Connections: Quine’s thesis of the Indeterminacy of Translation, via the Stanford Encyclopedia of Philosophy.

Automatic pilot for the garden of forking paths?

From Kurzweil Accelerating Intelligence on Patrick Tucker’s The Naked Future: “Computer scientist Stephen Wolfram, and futurist Ray Kurzweil have famously painstakingly recorded every minute detail of their lives, from their diets to the keystrokes, in order to quantify and better their lives. Now, technology has made self-quantification easier than ever, allowing the “everyman” to record and study their habits just as Wolfram and Kurzweil have done, but with less hassle… So what happens in a future that anticipates your every move? The machines may have a better handle on us than ever, but we’ll live better as a result.  The naked future is upon us, and the implications for how we live and work are staggering.”


Source: The Modern Word. Borges.

“In all fictional works, each time a man is confronted with several alternatives, he chooses one and eliminates the others; in the fiction of Ts’ui Pên, he chooses— simultaneously—all of them. He creates, in this way, diverse futures, diverse times which themselves also proliferate and fork.” The Garden of Forking Paths by Jorge Luis Borges.

From the Stanford Encyclopedia of Philosophy, on Branching time semantics: “As an explicit (or formalised) idea, branching time was first suggested to Prior in a letter from Saul Kripke in September 1958. This letter contains an initial version of the idea and a system of branching time, although it was not worked out in details.”

More on branching time semantics: Around the tree. Semantic and Metaphysical issues concerning branching and the open future.

The Grammar of individuation and counting

Suzi Oliveira de Lima: The grammar of individuation and counting. 2014 UMass dissertation.


Suzi Oliveira de Lima: From personal website

Are there languages that do not draw a grammatical distinction between count nouns and mass nouns? Some scholars have said there aren’t. Others have claimed that there are languages where all non-referential nouns are mass nouns. Henry Davis and Lisa Matthewson (1999) argued that in the Salish language St’át’imcets, all non-referential nouns are count nouns. Suzi Lima has been investigating another language with this property: the Tupi language Yudja (Juruna family). Lima’s dissertation is a game changer in fieldwork methodology: her findings do not just rely on the by now standard elicitation tasks for semantic fieldwork, but use a wider range of experimental techniques, including quantity judgment tasks and production and comprehension studies with children and adults.

Related work from SULA 8: Andrea Wilhelm made a case that in Dëne Sųłiné (Chipewyan) all nouns are referential. Nouns either denote individuals or kinds, they do not have predicative denotations at all. Amy Rose Deal suggested that in Nez Perce (Niimiipuutímt, Sahaptian), all notional mass nouns can have both count and mass denotations. What is emerging from this cross-linguistic work, then, is that languages have options for construing noun denotations. The possible options seem to be: reference to individuals, reference to kinds, singular, plural, or number-neutral atomic properties, and non-atomic properties. There are repercussions of whatever option is chosen by a language. A language that has no mass nouns should not have measure phrases, for example, and this is so for Yudja, as Lima shows. A language where all nouns are referential should not have intersective adjectives or restrictive relative clauses, and this is so for Dëne Sųłiné, as Wilhelm shows.  

Most recent work on the count-mass distinction in natural languages responds in one way or another to Gennaro Chierchia’s influential Reference to Kinds Across Languages, which is one of the most downloaded papers for Natural Language Semantics. With over 1400 citations, it is also one of the most cited papers in semantics.

Philosophical conversations with Sarah Jane Leslie

Sarah-Jane Leslie is anchoring conversations with well-known philosophers in a series entitled “Philosophical Conversations.” The series has conversations with Joshua Knobe, Rae Langton, Kwame Anthony Appiah, Roger Scruton, and Elizabeth Harman. Upcoming interviews will include Sarah-Jane’s father, the cognitive psychologist and autism expert Alan Leslie. The series can also be accessed via the Sanders Foundation’s YouTube channel.

Source: Marc Sanders Foundation.

Individuals with episodic amnesia are not stuck in time

Carl F. Craver, Donna Kwan, Chloe Steindam, R. Shayna Rosenbaum: Individuals with episodic amnesia are not stuck in time. Neuropsychologia 57, May 2014, pages 191–195.

“The idea that episodic memory is required for temporal consciousness is common in science, philosophy, fiction, and everyday life. Related ideas follow naturally: that individuals with episodic amnesia are lost mariners, stuck in time, in a “permanent present tense” or “lost in a non-time, a sort of instantaneous present”. Yet recent evidence suggests that people with episodic amnesia are not stuck in time. Episodic memory and future thought are dissociable from semantic knowledge of time, attitudes about time, and consideration of future consequences in decision-making. These findings illustrate how little is known about the sense of time in episodic amnesia and suggest that the human sense of time is likely not one thing, but many things.”