
Paper 1467

Using Entropy to Learn OT Grammars from Surface Forms Alone
Jason Riggle
Pages 346-353

Abstract

The problem of ranking Optimality Theoretic constraints in a fashion that is consistent with a given set of (input, output) pairs has been solved with a variety of algorithms (cf. Tesar and Smolensky 2002, Boersma and Hayes 1999). The real-world problem of learning from outputs alone, however, still presents a host of challenges. Chief among these is the fact that there are often several (possible-input, possible-grammar) pairs that are consistent with a given set of surface forms. This paper explores strategies for learning from surface forms that use information-theoretic measures of the randomness (entropy) of the input set associated with each grammar hypothesis as a heuristic to select the grammars that maximally encode any observed patterns. This represents a straightforward use of the Richness of the Base Hypothesis (Smolensky 1996) to avoid encoding observed patterns as accidental patterns in the lexicon.
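The entropy heuristic described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes underlying forms are plain symbol strings, and the names `symbol_entropy` and `pick_hypothesis` are hypothetical. The idea is that, under Richness of the Base, a hypothesis that forces a highly structured (low-entropy) input set is smuggling the observed patterns into the lexicon, so the learner prefers the hypothesis whose required inputs look most random.

```python
import math
from collections import Counter

def symbol_entropy(forms):
    """Shannon entropy (bits per symbol) of the pooled symbol
    distribution across a set of candidate underlying forms."""
    counts = Counter(ch for form in forms for ch in form)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def pick_hypothesis(hypotheses):
    """hypotheses: dict mapping a grammar label to the input set that
    grammar requires to generate the observed surface forms.
    Prefer the hypothesis whose inputs have the highest entropy,
    i.e. whose lexicon encodes the least accidental structure."""
    return max(hypotheses, key=lambda h: symbol_entropy(hypotheses[h]))
```

For example, given two hypothetical analyses of the same surface data, `pick_hypothesis({"G1": ["pata", "taka"], "G2": ["papa", "tata"]})` returns `"G1"`, whose input set spreads probability over more symbols and so has higher entropy (1.75 vs. 1.5 bits per symbol).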

Published in

Proceedings of the 25th West Coast Conference on Formal Linguistics
edited by Donald Baumer, David Montero, and Michael Scanlon