Logic and Grammar


Sandro Botticelli: A Young Man Being Introduced to the Seven Liberal Arts

Why do I make my semantics students learn logic? I ask them to work through both volumes of the Gamut textbook, even though Gamut doesn’t speak the language of linguistics. It is written in the language of logic. Why should semantics students have to learn how to talk and reason in this way? There is a simple answer: In an interdisciplinary field everyone from any participating field has to speak the language of the other fields. That’s your entrance ticket for success in an interdisciplinary enterprise. You have to understand where the practitioners of other fields are coming from. As a relatively new interdisciplinary field, formal semantics has been a success. It is the result of the marriage of two highly formalized and abstract theories: Logic, which provides theories of the human notion of what a valid piece of reasoning is, and Syntax, which contributes theories of how hierarchical syntactic structures are computed in natural languages. The marriage is solid and has been going strong for almost 50 years. Many young linguists, logicians, and philosophers are fluent in three disciplines, and collaborate in joint research institutions, journals, and conferences.

You may have heard people say that theories of logic can’t be cognitive theories because people make logical mistakes. Yes, we all do make logical mistakes. What is important, though, is that, when we do, we can be convinced that we were wrong. How come? There must be a notion of what a valid piece of reasoning is that is the same for all human beings. Imagine what the world would be like if people all had different notions of what follows from what and of what is or isn’t consistent. Mathematics would be impossible, science would be impossible, laws and contracts would be impossible, social institutions would be impossible, … For more than 2000 years, logicians have been designing theories of universally shared patterns of valid human reasoning. The resulting theories are among the most sophisticated theories science has produced to date. And they are the most sophisticated formal theories in cognitive science. One of the key insights of the early logicians was the discovery that little words like not, and, or, some, all, must, may, and so on are the main players in patterns of valid reasoning. That is, those patterns are created by properties of the functional (that is, logical) vocabularies of human languages. It’s precisely those vocabularies that also provide the scaffolding for syntactic structures. Syntax is about the hierarchical structures projected from the functional vocabularies of natural languages; Logic provides the models of how to study the meanings of those vocabularies and how to explain their role in reasoning. In formal semantics, those two disciplines have come together.
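The idea that the little logical words drive validity can be made concrete with a brute-force truth-table check. This is a minimal sketch, not a piece of semantic theory: an argument is valid just in case no assignment of truth values makes all the premises true while making the conclusion false. The function name `valid` and the lambda encoding of sentences are illustrative choices, not anyone’s official notation.

```python
from itertools import product

def valid(premises, conclusion, variables):
    """An argument is valid iff every truth-value assignment that
    makes all premises true also makes the conclusion true."""
    for values in product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False  # found a counterexample assignment
    return True

# Modus ponens: from "p" and "if p then q", infer "q" -- valid.
print(valid([lambda e: e["p"],
             lambda e: (not e["p"]) or e["q"]],
            lambda e: e["q"],
            ["p", "q"]))  # True

# Affirming the consequent: from "q" and "if p then q",
# infer "p" -- invalid, as the checker confirms.
print(valid([lambda e: e["q"],
             lambda e: (not e["p"]) or e["q"]],
            lambda e: e["p"],
            ["p", "q"]))  # False
```

Notice that the check never looks at the content words at all; only the logical vocabulary (here, "not", "or", and the conditional) determines which patterns come out valid.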

Contemporary formal semantics was born when the traditional perspectives of logic merged with the modern enterprise of generative syntax, as initiated by Noam Chomsky. The first worked-out formal semantic system in this tradition was David Lewis’ 1970 paper “General Semantics,” one of the most beautiful and enjoyable articles in semantics to the present day. Lewis made an explicit connection with Chomsky’s Aspects model, the generative syntax model of the time. In contrast to Lewis, Richard Montague was outspokenly hostile to Chomsky’s work. He was not interested in Chomsky’s call for an explanatory syntax. It was only after Montague’s death that linguists like David Dowty, Lauri Karttunen, Barbara Partee, Stanley Peters, and Robert Wall made Montague’s works accessible to linguistic audiences.

The man who tried to redeem the world with logic


Source: Nautilus

From Nautilus:

“Though they started at opposite ends of the socioeconomic spectrum, McCulloch and Pitts were destined to live, work, and die together. Along the way, they would create the first mechanistic theory of the mind, the first computational approach to neuroscience, the logical design of modern computers, and the pillars of artificial intelligence. But this is more than a story about a fruitful research collaboration. It is also about the bonds of friendship, the fragility of the mind, and the limits of logic’s ability to redeem a messy and imperfect world.”

“The moment they spoke, they realized they shared a hero in common: Gottfried Leibniz. The 17th-century philosopher had attempted to create an alphabet of human thought, each letter of which represented a concept and could be combined and manipulated according to a set of logical rules to compute all knowledge—a vision that promised to transform the imperfect outside world into the rational sanctuary of a library.”

Kai von Fintel: Decoding the Meaning of Language

Kai von Fintel: “Linguistics is basically the science of language. You use a scientific approach, but you get to apply it to something central to humanity. We put these signals in the world and others can read our mind to some extent. I find that a baffling phenomenon — why not try to figure that out?” Full story by SHASS Communications.

“What makes linguistics, the science of language, so fascinating is that it exists at the intersection of science and the humanities. You use a scientific approach, and you get to apply it to something central to humanity.”

“We’re trying to find patterns in data, making hypotheses, throwing more data at it and seeing how it holds up,” he says. “We look at facts to distinguish what we can understand versus what we can’t.”