Author Archives: Michael

Lingle on Wednesday November 29

The first Lingle (Linguist Mingle) of the year will happen on Wednesday, November 29 at 5:30. Lingles are events for all of our various linguistics undergraduate majors. They are partly social and partly informational. This Lingle will include:

1. Food and socializing
2. An info session (optional!) on computational linguistics, including a Q&A on jobs in linguistics with one of our recent graduates.

Everyone is welcome. Please join if you’re able!

Susi Wurmbrand colloquium Friday Nov 18

Susi Wurmbrand (Harvard University) will present “Implicational complementation hierarchies: Containment and the freedom of syntax” on Friday, November 18, 2022 at 3:30pm as part of the Linguistics colloquium series. The presentation will be both in-person in S331 in the ILC and available through Zoom. The abstract can be found below. All are welcome!

Typological and cross-linguistic observations show that complementation configurations can be ranked according to their semantic properties, forming an implicational complementation hierarchy along which syntactic or morphological distinctions operate. I suggest a model where the cross-linguistically stable (possibly universal) properties follow from a rigid syntax–semantics mapping of categories defined via containment, whereas variable properties indicate the points where syntax may act autonomously. I will discuss several phenomena where implicational relations have been observed (among them finiteness, transparency, restructuring, the left periphery) and show that they can be related to truncation options (whether implemented via exfoliation, structure removal or non-projection) regulated by containment.

Roeper in Germany

Tom Roeper reports:

The GALA 15 conference (Generative Approaches to Language Acquisition) took place in Frankfurt (Sept. 22–24), organized by Petra Schulz (a former visitor here), with an invited address by Ana Perez, a former student here, and with papers by former visitors Angeliek van Hout, Camelia Bleotu (with Tom Roeper), and Petra Schulz, and LAWNE member William Snyder, as well as a poster by Uscha Lakschmann, Deb Foucault, and Tom Roeper.

A very nice memorial was held for Jürgen Weissenborn at the GALA conference in Frankfurt on Sept. 24th. Jürgen died at 83 on Feb. 28th. He was a frequent visitor at UMass and had many friends here, including Peggy Speas, Barbara Partee, and others. He married one of our students, Janet Randall, who attended by Zoom. His wife Bettina, daughter Pia, and three grandchildren came. In addition to many remembrances, letters from Barbara Partee and Jill deVilliers were read.

Jürgen collaborated with Jill deVilliers and me on the acquisition of long-distance rules, with Angeliek van Hout on auxiliary learning, and with Sandra Waxman on word learning, among the many topics he worked on.

I also gave a lecture at ZAS in Berlin on “Minimal Interfaces as a Guide to L2, and the formulation of the Thought-Cognition connection”, and (more or less) the same talk at the conference in Wuppertal on “Optionality and Variation in Multilingual Syntax”, organized by Leah Bauke (a former visitor here) with many leaders in the field (Antonella Sorace, Marit Westergaard, Theresa Biberauer, among others).

UMass linguists and alumni at AMP 2022

This year’s Annual Meeting on Phonology (AMP) was hosted by UCLA October 21–23. The AMP series began as Phonology 2013 here at UMass.

Faculty member Gaja Jarosz gave a keynote address, “Generalizing from Inconsistent Data: How Much do Exceptions Count?”

Presentations from current students, faculty, and alumni included:

  • Alessa Farinella presented “Prosodic constituency in Tagalog”
  • Cerys Hughes presented “Probing a Neural Network Model of Sound Change for Perceptual Integration”
  • Brandon Prickett (PhD 2021) presented “Is Sour Grapes Learnable? A Computational and Experimental Approach”
  • Seung Suk Lee, Cerys Hughes, Alessa Farinella and Joe Pater presented “Learning stress with feet and grids”
  • Aleksei Nazarov (PhD 2016) and Brian Smith (PhD 2015) presented “Generalizing French schwa deletion: the role of indexed constraints”
  • Anne-Michelle Tessier (PhD 2007), Karen Jesney (PhD 2011), Kaili Vesik, Roger Lo and Marie-Eve Bouchard presented “The Productive Status of Canadian French Liaison: Variation across Words and Grammar”

Seoyoung Kim to Alexa AI

After successfully defending her dissertation in July, Seoyoung Kim has accepted a position as a Language Data Scientist with Alexa AI in Seattle, Washington. She will handle specialized data analysis and research requests that support the training and evaluation of machine learning models and the processing of language data collections. Wow! Congratulations, Seoyoung!

Seoyoung Kim successfully defends dissertation!

Seoyoung Kim successfully defended her dissertation “Restrictive Tier Induction” on Friday July 8. Pictured from left to right with the defense fish are Gaja Jarosz, Seoyoung Kim, and Michael Becker. Committee member Maria Gouskova (NYU) participated remotely. Congratulations Seoyoung!

Workshop on modification features UMass linguists

A workshop on modification took place November 26–27, organized by current visitor Camelia Bleotu and faculty member Deborah Foucault. Three of our undergrads were invited to present: Tyler Poisson, Sarah Kim, and Mirella Vladova. The invited speaker was Tom Roeper.

WORKSHOP ON MODIFICATION (organized by Adina Camelia Bleotu & Deborah Foucault)
Tyler Poisson (UMass Amherst): The Modificational Possessive: natural, intuitional, and experimental evidence for a syntactic analysis of generic possessives.
Sarah Kim (UMass Amherst): Acquisition of Exhaustivity for the English Definite Article in Speakers of Languages with Article Absence
Mirella Vladova (UMass Amherst): A Look Into Children’s Priority in Genitive and Prepositional Recursion.
Ioana-Amalia Luciu (University of Bucharest) & Adina Camelia Bleotu (University of Bucharest, ZAS): How Are Size, Age, Shape and Color Adjectives Ordered in English and Romanian? An Experimental Investigation
Daniela-Gabriela Truşcǎ (University of Bucharest) & Adina Camelia Bleotu (University of Bucharest, ZAS): An Experimental Investigation of the Ordering of Quality, Size and Color Adjectives in English and Romanian
Invited speaker: Tom Roeper (UMass Amherst): How to put something inside itself

Ukrainian visiting linguist Khrystyna Kunets featured

The Daily Hampshire Gazette featured our recent visiting Fulbrighter from Ukraine, Khrystyna Kunets: “Normally, Ukrainian linguist Khrystyna Kunets spends her workday like many academics do, teaching and researching topics to which she’s devoted her professional career, such as semantics, syntax and translation studies…”

Living Languages – new journal from UMass

The first international, multilingual journal entirely dedicated to indigenous and minoritized language revitalization and sustainability was launched at UMass last month. LIVING LANGUAGES – LENGUAS VIVAS – LÍNGUAS VIVAS is an open-access journal hosted by ScholarWorks@UMassAmherst. The journal’s first volume can be found here:

The editors-in-chief are our colleague Luiz Amaral (Spanish and Portuguese Studies) and Gabriela Pérez Báez, and the editorial board includes Michael Becker. The journal’s three launch events took place on February 21, International Mother Language Day; 2022 is the first year of the International Indigenous Languages Decade (2022–2032). You can watch the events on YouTube using the following links:

Gouskova colloquium Friday Feb 18 at 3:30

Maria Gouskova (New York University) will present “Morpheme Structure Constraints Revisited” in the Linguistics colloquium series at 3:30 Friday February 18, by Zoom. An abstract follows. All are welcome!

Most constraint-based frameworks embrace Richness of the Base: the assumption that no interesting generalizations are stated as constraints on the lexicon (a.k.a. Morpheme Structure Constraints, or MSCs). The main argument against MSCs is that they introduce duplication into the theory. When the same constraints define the shapes of morphemes and restrict derived words, the latter, surface-oriented constraints should be sufficient. Unlike MSCs, surface-oriented constraints are less abstract, and are independently necessary. This echoes earlier criticisms of MSCs: they are redundant, abstract, and unlearnable.

In this talk, I revisit MSCs in the context of Russian voicing. Russian voicing was Morris Halle’s (1959) original battleground against structuralism—which he, incidentally, also criticized for having a duplication problem. By treating contrastive oppositions differently from non-contrastive ones, structuralism fails to capture the generalization that Russian voicing assimilation works on all obstruents alike, whether they contrast for voicing phonemically (/b/ vs. /p/) or are obligatorily voiceless (e.g., /tʃ/). My concern is not the undergoers; rather, it is the lack of certain contrasts predicted by the popular Positional Faithfulness account of voicing neutralization in Optimality Theory (Lombardi 1999 and many others). I will show that even though this account captures the phonetics and typology of voicing contrasts, it has a problem with Russian. There are several alternatives, but all encounter some kind of a duplication problem. I will argue for MSCs against consonants such as the affricate /dʒ/ in the lexicon. Another alternative would include a host of markedness constraints covering positions where [dʒ] does not occur, but this move introduces a duplication into the analysis: the phonology of certain consonants must be handled twice. These constraints, moreover, are neither phonetically grounded nor formally sensible; all they do is plug the holes in the analysis.

My account, like everyone else’s, has a duplication problem. But unlike other analyses, it explains facts such as the handling of loanword [dʒ], which is borrowed as a CC cluster in Russian, and which behaves as though it is never represented as an affricate in the system. I conclude with a discussion of a learnability proposal for MSCs within a constraint-based framework, Minimum Description Length (Rasin & Katzir 2016). I discuss some complications that arise in applying Minimum Description Length to learning certain kinds of hidden structure, especially structure that allows words to be shorter and grammars to be simpler.