CfP: Neural network models for articulatory gestures (LabPhon Satellite Workshop)

The workshop Neural network models for articulatory gestures (a satellite to LabPhon 17) calls for abstracts that bring together articulation data and computational modelling, especially neural network modelling.
We welcome any abstract, including tentative work, on the possibility of using neural and/or deep computational modelling for articulatory data. Suggested topics include:

–   Whether it is possible to capture invariants, i.e. language-independent, predictable patterns that apply to all articulation
–   Whether transfer learning is possible, i.e. whether a network trained on the articulatory features of one speaker (and, ultimately, one language) can be mapped onto the patterns of another speaker (or language)
–   Whether the annotation of gestures can be aided by generating the most likely gesture structures, analogous to the derivation of articulation from acoustics (e.g., Mitra et al., 2010)
–   Whether diagnostic classification is possible on networks that model articulation, analogous to, e.g., the detection of counterparts to compositionality in a model of arithmetic grammar by Hupkes & Zuidema (2017)

Please use the link to EasyChair ( to submit abstracts; the deadline is 15 March. Tentative work is more than welcome! As with the main LabPhon conference, abstracts should be written in English and should not exceed one page of text. References, examples and/or figures may optionally be included on a second page. Submitted abstracts must be in PDF format, in 12-point Times New Roman, with 1-inch margins and single spacing. We do not require anonymous abstracts.

Website for details:

Contact Tom Lentz (