Neural Computation

July 2017, Vol. 27, No. 7, Pages 1964-1985
(doi: 10.1162/NECO_a_00970)
© 2017 Massachusetts Institute of Technology
A Customized Attention-Based Long Short-Term Memory Network for Distant Supervised Relation Extraction
Distant supervision, a widely applied approach to relation extraction, can automatically generate large amounts of labeled training data with minimal manual effort. However, the resulting corpus may contain many false-positive instances, which hurt the performance of relation extraction. Moreover, traditional feature-based distant supervised approaches rely on features designed by hand with natural language processing tools, which can also degrade performance. To address these two shortcomings, we propose a customized attention-based long short-term memory network. Our approach adopts word-level attention to learn better data representations for relation extraction without manually designed features, performing distant supervision rather than fully supervised relation extraction, and it utilizes instance-level attention to tackle the problem of false-positive data. Experimental results demonstrate that our proposed approach is effective and outperforms traditional methods.
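The two attention mechanisms described above can be illustrated with a minimal numerical sketch. This is an assumption-laden simplification, not the paper's model: it uses plain dot-product scoring with a learned query vector at each level, and treats the LSTM hidden states and sentence "bag" as given arrays. Word-level attention pools a sentence's hidden states into one representation; instance-level attention then pools the sentences in a bag, downweighting likely false positives.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def word_attention(H, w):
    """Word-level attention (sketch).

    H: (T, d) hidden states of an LSTM over a T-word sentence.
    w: (d,) hypothetical learned query vector (an assumption here).
    Returns a (d,) attention-weighted sentence representation.
    """
    alpha = softmax(H @ w)        # (T,) per-word weights
    return alpha @ H              # weighted sum of hidden states

def instance_attention(S, r):
    """Instance-level attention over a bag of sentences (sketch).

    S: (n, d) sentence representations sharing one entity pair.
    r: (d,) hypothetical relation query vector (an assumption here).
    Returns a (d,) bag representation; noisy (false-positive)
    sentences receive low weights and contribute little.
    """
    beta = softmax(S @ r)         # (n,) per-sentence weights
    return beta @ S

# Toy usage with random data standing in for real LSTM outputs.
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))       # 5 words, hidden size 8
w = rng.normal(size=8)
sent_rep = word_attention(H, w)   # (8,) sentence vector

S = np.stack([sent_rep, rng.normal(size=8), rng.normal(size=8)])
r = rng.normal(size=8)
bag_rep = instance_attention(S, r)  # (8,) bag vector fed to the classifier
```

In the full model, the bag representation would be passed to a softmax classifier over relation types, and the query vectors would be trained jointly with the LSTM; here they are fixed random placeholders.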