Various authors have recently endorsed Harmonic Grammar (HG) as a replacement for Optimality Theory (OT). One argument for this move is computational: OT looks prima facie like an exotic framework with no counterpart in Machine Learning, whereas replacing it with HG allows methods and results from Machine Learning to be imported into Computational Phonology. This paper shows that this argument in favor of HG and against OT is wrong: algorithms for HG can be ported into OT rather trivially, so HG has no computational advantage over OT. This simple result is significant because it extends the current toolkit for computational OT with algorithmic tools from Machine Learning that had previously been considered unfit for OT. The fruitfulness of this new approach to Computational OT is illustrated by deriving a convergence proof for a (non-stochastic) variant of Boersma's (1997, 1998) GLA from the convergence of the HG Perceptron Algorithm.
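For readers unfamiliar with the HG Perceptron Algorithm mentioned above, the following is a minimal sketch (not the paper's implementation) of the standard error-driven Perceptron update for Harmonic Grammar weights; the constraint weights, violation profiles, and function names are hypothetical illustrations.

```python
def harmony(weights, violations):
    """Harmony of a candidate: the negative weighted sum of its
    constraint violations (higher harmony = better candidate)."""
    return -sum(w * v for w, v in zip(weights, violations))

def perceptron_update(weights, winner, loser, rate=1.0):
    """Error-driven update: if the current weights fail to prefer the
    intended winner over a losing competitor, raise the weights of
    constraints the loser violates more and lower the weights of
    constraints the winner violates more."""
    if harmony(weights, loser) >= harmony(weights, winner):
        weights = [w + rate * (l - v)
                   for w, v, l in zip(weights, winner, loser)]
    return weights

# Hypothetical example with two constraints: the intended winner
# violates the second constraint once, the loser the first one once.
w = [1.0, 1.0]
winner = [0, 1]   # violation profile of the intended winner
loser  = [1, 0]   # violation profile of the losing competitor
w = perceptron_update(w, winner, loser)
```

After one update on this hypothetical datum, the second constraint is demoted and the first promoted, so the learner now correctly prefers the winner; the Perceptron convergence result guarantees that, on consistent data, such updates stop after finitely many errors.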
Proceedings of the 29th West Coast Conference on Formal Linguistics
edited by Jaehoon Choi, E. Alan Hogue, Jeffrey Punske, Deniz Tat, Jessamyn Schertz, and Alex Trueman
ISBN 978-1-57473-451-5 library binding
viii + 406 pages
publication date: 2012
published by Cascadilla Proceedings Project, Somerville, MA, USA