
Computational Linguistics

Paola Merlo, Editor
September 2005, Vol. 31, No. 3, Pages 289-296
(doi: 10.1162/089120105774321109)
© 2005 Association for Computational Linguistics
Evaluating Discourse and Dialogue Coding Schemes

Agreement statistics play an important role in the evaluation of coding schemes for discourse and dialogue. Unfortunately, there is a lack of understanding regarding which agreement measures are appropriate and how their results should be interpreted. In this article, we describe the role of agreement measures and argue that only chance-corrected measures that assume a common distribution of labels for all coders are suitable for measuring agreement in reliability studies. We then provide recommendations for how reliability should be inferred from the results of agreement statistics.
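One chance-corrected coefficient of the kind the abstract endorses, in which expected agreement is computed from a single label distribution pooled over all coders rather than from per-coder distributions, is Scott's pi. A minimal two-coder sketch in Python (the function name and toy data are illustrative, not taken from the article):

```python
from collections import Counter

def scott_pi(coder1, coder2):
    """Scott's pi: chance-corrected agreement where expected agreement
    comes from one label distribution pooled over both coders
    (the 'common distribution' assumption)."""
    assert len(coder1) == len(coder2)
    n = len(coder1)
    # Observed agreement: fraction of items the coders label identically.
    a_o = sum(x == y for x, y in zip(coder1, coder2)) / n
    # Pooled label distribution over all 2n judgments.
    counts = Counter(coder1) + Counter(coder2)
    a_e = sum((c / (2 * n)) ** 2 for c in counts.values())
    # Chance correction: rescale observed agreement above chance.
    return (a_o - a_e) / (1 - a_e)

# Toy example: two coders labeling four items with categories 'a'/'b'.
print(scott_pi(['a', 'a', 'b', 'b'], ['a', 'b', 'b', 'b']))
```

By contrast, a coefficient such as Cohen's kappa estimates chance agreement from each coder's individual label distribution; the pooled variant above generalizes naturally to more than two coders (as in Fleiss's multi-coder formulation).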