Moore-Cantwell and Pater (2016): Gradient Exceptionality in Maximum Entropy Grammar with Lexically Specific Constraints

Moore-Cantwell, Claire and Joe Pater (2016). Gradient Exceptionality in Maximum Entropy Grammar with Lexically Specific Constraints. To appear in Bonet, Eulàlia & Francesc Torres-Tamarit (eds.), Catalan Journal of Linguistics 15.

Comments welcome!

Abstract. The number of exceptions to a phonological generalization appears to gradiently affect its productivity. Generalizations with relatively few exceptions are relatively productive, as measured in tendencies to regularization, as well as in nonce word productions and other psycholinguistic tasks. Gradient productivity has been previously modeled with probabilistic grammars, including Maximum Entropy Grammar, but they often fail to capture the fixed pronunciations of the existing words in a language, as opposed to nonce words. Lexically specific constraints allow existing words to be produced faithfully, while permitting variation in novel words that are not subject to those constraints. When each word has its own lexically specific version of a constraint, an inverse correlation between the number of exceptions and the degree of productivity is straightforwardly predicted.
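To make the abstract's mechanism concrete, here is a minimal sketch (mine, not the paper's implementation) of a Maximum Entropy grammar: each candidate's probability is proportional to the exponential of its negated weighted constraint violations. The constraint names, weights, and candidate sets below are hypothetical toy values chosen only to illustrate how a lexically specific faithfulness constraint lets an existing exceptional word surface faithfully while a nonce word, not subject to that constraint, shows gradient variation.

```python
import math

def maxent_probs(candidates, weights):
    """MaxEnt candidate probabilities.

    candidates: {candidate_name: {constraint_name: violation_count}}
    weights:    {constraint_name: weight}
    P(candidate) is proportional to exp(-sum of weighted violations).
    """
    harmonies = {
        name: -sum(weights.get(c, 0.0) * v for c, v in viols.items())
        for name, viols in candidates.items()
    }
    z = sum(math.exp(h) for h in harmonies.values())
    return {name: math.exp(h) / z for name, h in harmonies.items()}

# Hypothetical constraints: a general markedness constraint "*X" penalizes
# the exceptional output shape; "Faith-word1" is a lexically specific
# faithfulness constraint that only word1 is subject to.
weights = {"*X": 2.0, "Faith-word1": 6.0}

# Existing exceptional word: deviating from its listed form violates the
# heavily weighted lexical constraint, so the faithful (exceptional)
# candidate wins near-categorically.
word1 = maxent_probs(
    {"exceptional": {"*X": 1}, "regular": {"Faith-word1": 1}},
    weights,
)

# Nonce word: no lexically specific constraint applies, so only the
# general constraint decides, yielding a gradient preference for the
# regular candidate rather than a fixed pronunciation.
nonce = maxent_probs(
    {"exceptional": {"*X": 1}, "regular": {}},
    weights,
)
```

With these toy weights the existing word keeps its exceptional form with high probability, while the nonce word regularizes only gradiently; lowering the weight of the general constraint (as more exceptions accumulate in learning) would shift the nonce-word distribution toward the exceptional pattern, matching the inverse correlation the abstract describes.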
