BERT masked language modeling for co-reference resolution

Document type: Conference report
Defense date: 2019
Rights access: Open Access
Except where otherwise noted, content on this work is licensed under a Creative Commons license: Attribution-NonCommercial-NoDerivs 3.0 Spain.
Abstract
This paper describes the TALP-UPC participation in the Gendered Pronoun Resolution shared task of the First ACL Workshop on Gender Bias in Natural Language Processing. We implemented two models for masked language modeling using pre-trained BERT, adjusted to work on a classification problem. The proposed solutions are based on the word probabilities of the original BERT model, but use common English names to replace the original test names.
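
The abstract describes the approach only at a high level. As an illustrative sketch (not the authors' released code), the snippet below shows how a pre-trained BERT masked language model can be queried for the probabilities of candidate names at a masked position; the Hugging Face `transformers` library, the `bert-base-cased` checkpoint, the helper `name_probabilities`, and the example names are assumptions made for this example.

```python
# Illustrative sketch: compare the masked-LM probabilities that a pre-trained
# BERT model assigns to candidate names at the [MASK] position. This mirrors
# the general idea in the abstract (using BERT word probabilities with common
# English names), not the authors' exact implementation.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
model = BertForMaskedLM.from_pretrained("bert-base-cased")
model.eval()

def name_probabilities(sentence_with_mask, candidate_names):
    """Return P(name | context) for each single-token candidate name
    at the [MASK] position of the input sentence."""
    inputs = tokenizer(sentence_with_mask, return_tensors="pt")
    # Locate the [MASK] token in the input sequence.
    mask_positions = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    with torch.no_grad():
        logits = model(**inputs).logits
    # Softmax over the vocabulary at the masked position.
    probs = torch.softmax(logits[0, mask_positions[0]], dim=-1)
    return {
        name: probs[tokenizer.convert_tokens_to_ids(name)].item()
        for name in candidate_names
    }

# Example: the original test names are replaced by common English names
# ("Mary" and "John", chosen arbitrarily), and the model's probabilities
# at the masked position are compared.
sentence = "Mary told John that [MASK] would arrive late."
print(name_probabilities(sentence, ["Mary", "John"]))
```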
Citation: Alfaro, F.; Ruiz, M.; Fonollosa, J. A. R. BERT masked language modeling for co-reference resolution. In: Workshop on Gender Bias in Natural Language Processing. "Proceedings of the First Workshop on Gender Bias in Natural Language Processing". 2019, p. 76-81.
Publisher version: https://www.aclweb.org/anthology/W19-3811
Files

| File | Size | Format |
|---|---|---|
| W19-3811.pdf | 523.2 KB | PDF |