Revealing the Myth of Higher-Order Inference in Coreference Resolution

Liyan Xu, Jinho D. Choi
Abstract

This paper analyzes the impact of higher-order inference (HOI) on the task of coreference resolution. HOI has been adopted by almost all recent coreference resolution models, yet its true effectiveness relative to representation learning has received little investigation. To make a comprehensive analysis, we implement an end-to-end coreference system as well as four HOI approaches: attended antecedents, entity equalization, span clustering, and cluster merging, where the latter two are our original methods. We find that given a high-performing encoder such as SpanBERT, the impact of HOI is negative to marginal, offering a new perspective on HOI for this task. Our best model, using cluster merging, achieves an Avg-F1 of 80.2 on the English portion of the CoNLL 2012 shared task dataset.
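
For readers unfamiliar with HOI, the sketch below illustrates the attended-antecedent refinement (the second-order inference of Lee et al., 2018), one of the four approaches compared in the paper: each span representation is interpolated with the attention-weighted average of its candidate antecedents through a learned gate. This is a minimal illustrative sketch, not the authors' implementation; the tensor names, shapes, and the `gate_proj` module are assumptions for exposition.

```python
import torch
import torch.nn.functional as F

def attended_antecedent_refine(span_emb, antecedent_scores, antecedent_idx, gate_proj):
    """One round of attended-antecedent refinement (second-order HOI).

    span_emb:          [num_spans, emb]  current span representations g_i
    antecedent_scores: [num_spans, k]    coreference scores over k candidate antecedents
    antecedent_idx:    [num_spans, k]    indices of the k candidate antecedents per span
    gate_proj:         nn.Linear(2 * emb, emb), learned gate (hypothetical module name)
    """
    # Softmax over candidate antecedents gives attention weights P(y | i).
    attn = F.softmax(antecedent_scores, dim=-1)                  # [num_spans, k]
    # Expected antecedent embedding a_i = sum_y P(y | i) * g_y.
    antecedent_emb = span_emb[antecedent_idx]                    # [num_spans, k, emb]
    attended = (attn.unsqueeze(-1) * antecedent_emb).sum(dim=1)  # [num_spans, emb]
    # Gated interpolation: g_i' = f_i * g_i + (1 - f_i) * a_i.
    gate = torch.sigmoid(gate_proj(torch.cat([span_emb, attended], dim=-1)))
    return gate * span_emb + (1.0 - gate) * attended
```

In the paper's analysis, iterating such refinements on top of a strong encoder like SpanBERT yields negative to marginal gains.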

Venue / Year

Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP) / 2020

Links

Anthology | Paper | Presentation | BibTeX | GitHub