Harmonic Grammar: Models and Methods

Instructor(s): Joe Pater

Harmonic Grammar (HG) was first introduced in 1990 by Legendre, Miyata, and Smolensky as a fusion of connectionism and generative linguistics. Interest in weighted constraint grammars has recently revived, with formalisms that also draw on explicitly probabilistic models (e.g. Maximum Entropy Grammar), and a number of software tools have been developed for exploring them. Starting with a version of HG that closely resembles the well-known "classic" OT model of Prince and Smolensky (1993/2004), the course then introduces more elaborate probabilistic models, as well as versions of HG that use serial derivations (as in Harmonic Serialism; McCarthy 2007 et seq.). It also shows how HG learning algorithms can be used to model human language acquisition, and how they can be applied in simulations of language change through iterated "agent-based" learning. Throughout, students will learn to use associated software tools to work with these theories.

No background in mathematics or computational modeling is assumed. Some acquaintance with Optimality Theory is expected, though students with strengths in computational work who lack this background should be able to follow the course.
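To give a concrete sense of the weighted-constraint evaluation at the heart of the course, here is a minimal sketch in Python. The constraint weights and candidate violation counts are invented for illustration; in HG, a candidate's harmony is the negated weighted sum of its constraint violations, the highest-harmony candidate wins, and Maximum Entropy Grammar assigns each candidate a probability proportional to its exponentiated harmony.

```python
import math

# Hypothetical weights for two constraints (e.g. one markedness,
# one faithfulness constraint) -- illustrative values only.
weights = [3.0, 1.0]

# Hypothetical candidates with their violation counts per constraint.
violations = {
    "cand1": [0, 1],  # satisfies constraint 1, violates constraint 2 once
    "cand2": [1, 0],  # violates constraint 1 once
}

def harmony(v, w):
    # Harmony = negated weighted sum of violations.
    return -sum(wi * vi for wi, vi in zip(w, v))

# Classic HG: the candidate with the highest harmony is the winner.
winner = max(violations, key=lambda c: harmony(violations[c], weights))

# Maximum Entropy Grammar: probabilities proportional to exp(harmony).
z = sum(math.exp(harmony(v, weights)) for v in violations.values())
probs = {c: math.exp(harmony(v, weights)) / z for c, v in violations.items()}
```

Here "cand1" wins (harmony -1 beats -3), and in the MaxEnt version the loser still receives some probability mass, which is what makes these models suitable for gradient and variable data.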


Time: Mon & Thu 8:30-10:15

Classroom: ATLAS 105

Areas of Linguistics:
Phonetics, Phonology and Morphology
Computational Linguistics