Linguistic Inquiry

Fall 2009, Vol. 40, No. 4, Pages 667-686
(doi: 10.1162/ling.2009.40.4.667)
© 2009 Massachusetts Institute of Technology
Some Correct Error-Driven Versions of the Constraint Demotion Algorithm
This article shows that Error-Driven Constraint Demotion (EDCD), an error-driven learning algorithm proposed by Tesar (1995) for Prince and Smolensky's (1993/2004) version of Optimality Theory, can fail to converge to a correct totally ranked hierarchy of constraints, unlike the earlier non-error-driven learning algorithms proposed by Tesar and Smolensky (1993). The cause of the problem is found in Tesar's use of “mark-pooling ties,” indicating that EDCD can be repaired by assuming Anttila's (1997) “permuting ties” instead. Proofs show, and simulations confirm, that totally ranked hierarchies can indeed be found by both this repaired version of EDCD and Boersma's (1998) Minimal Gradual Learning Algorithm.
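To make the learning procedure under discussion concrete, here is a minimal toy sketch of an error-driven constraint demotion step in Python. The constraint names, candidates, and violation counts are invented for illustration and are not from the article; the sketch also resolves ties arbitrarily in `optimal`, so it does not model the mark-pooling versus permuting-ties distinction that the article analyzes.

```python
# Toy sketch (hypothetical data) of one Error-Driven Constraint Demotion step.
# candidates: dict mapping candidate name -> {constraint: violation count}.
# hierarchy: list of constraint names, highest-ranked first.

def optimal(candidates, hierarchy):
    """Return the candidate that wins under a totally ranked hierarchy.

    Candidates are filtered constraint by constraint, keeping only those
    with the fewest violations; ties at the end are broken arbitrarily.
    """
    best = list(candidates)
    for c in hierarchy:
        fewest = min(candidates[cand].get(c, 0) for cand in best)
        best = [cand for cand in best if candidates[cand].get(c, 0) == fewest]
        if len(best) == 1:
            break
    return best[0]

def edcd_step(winner, candidates, hierarchy):
    """One error-driven update.

    If the current hierarchy picks the wrong candidate, demote every
    loser-preferring constraint ranked above the highest-ranked
    winner-preferring constraint to just below it. Assumes the intended
    winner is not harmonically bounded, so at least one constraint
    prefers it over the learner's erroneous output.
    """
    loser = optimal(candidates, hierarchy)
    if loser == winner:
        return hierarchy  # no error, so no learning step
    w, l = candidates[winner], candidates[loser]
    winner_pref = [c for c in hierarchy if w.get(c, 0) < l.get(c, 0)]
    pivot = winner_pref[0]  # highest-ranked winner-preferring constraint
    pivot_pos = hierarchy.index(pivot)
    # only loser-preferring constraints ranked above the pivot must move
    demote = [c for c in hierarchy[:pivot_pos] if l.get(c, 0) < w.get(c, 0)]
    new = [c for c in hierarchy if c not in demote]
    insert_at = new.index(pivot) + 1
    return new[:insert_at] + demote + new[insert_at:]
```

For example, with constraints `A` and `B`, an intended winner `x` violating only `A`, and a rival `y` violating only `B`, the hierarchy `["A", "B"]` wrongly selects `y`; one `edcd_step` demotes `A` below `B`, after which `x` wins and no further errors occur on this datum.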