It Depends: Dependency Parser Comparison Using A Web-based Evaluation Tool

Jinho D. Choi, Joel Tetreault, Amanda Stent


Abstract

The last few years have seen a surge in the number of accurate, fast, publicly available dependency parsers. At the same time, the use of dependency parsing in NLP applications has increased. It can be difficult for a non-expert to select a good "off-the-shelf" parser. We present a comparative analysis of ten leading statistical dependency parsers on a multi-genre corpus of English. For our analysis, we developed a new web-based tool that provides a convenient way to compare dependency parser outputs. Our analysis will help practitioners choose a parser to optimize their desired speed/accuracy trade-off, and our tool will help practitioners examine and compare parser output.
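Parser comparisons of this kind typically rest on the standard attachment-score metrics. As a rough illustration only (this is not the authors' tool, and the file names and CoNLL-X column layout are assumptions), the sketch below shows how unlabeled and labeled attachment scores (UAS/LAS) could be computed from gold and predicted parser output files:

    def read_conll(path):
        """Yield sentences as lists of (head, deprel) pairs from a CoNLL-X file.

        Assumes tab-separated columns: ID FORM LEMMA CPOS POS FEATS HEAD DEPREL ...
        """
        sentence = []
        with open(path, encoding="utf-8") as f:
            for line in f:
                line = line.strip()
                if not line:  # blank line ends a sentence
                    if sentence:
                        yield sentence
                        sentence = []
                    continue
                cols = line.split("\t")
                sentence.append((cols[6], cols[7]))  # HEAD, DEPREL columns
            if sentence:
                yield sentence

    def attachment_scores(gold_path, pred_path):
        """Return (UAS, LAS): fraction of tokens with the correct head,
        and with the correct head and dependency label."""
        total = uas_correct = las_correct = 0
        for gold, pred in zip(read_conll(gold_path), read_conll(pred_path)):
            assert len(gold) == len(pred), "sentence length mismatch"
            for (g_head, g_rel), (p_head, p_rel) in zip(gold, pred):
                total += 1
                if g_head == p_head:
                    uas_correct += 1
                    if g_rel == p_rel:
                        las_correct += 1
        return uas_correct / total, las_correct / total

    if __name__ == "__main__":
        # Hypothetical file names, for illustration only.
        uas, las = attachment_scores("gold.conll", "parser_output.conll")
        print(f"UAS: {uas:.2%}  LAS: {las:.2%}")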

Venue / Year

Proceedings of the Annual Meeting of the Association for Computational Linguistics (ACL) / 2015

Links

Anthology | Paper | Presentation | BibTeX