Enhancing sequence-to-sequence modeling for RDF triples to natural text

Cite as:
hdl:2117/366257
Document type: Conference report
Defense date: 2020
Publisher: Association for Computational Linguistics
Rights access: Open Access
This work is protected by the corresponding intellectual and industrial property rights. Except where otherwise noted, its contents are licensed under a Creative Commons license: Attribution 3.0 Spain.
Abstract
This work establishes key guidelines on how, which, and when Machine Translation (MT) techniques are worth applying to the RDF-to-Text task. Not only do we apply and compare the most prominent MT architecture, the Transformer, but we also analyze state-of-the-art techniques such as Byte Pair Encoding and Back Translation to demonstrate an improvement in generalization. In addition, we empirically show how to tailor these techniques to enhance models that rely on learned embeddings rather than pretrained ones. Automatic metrics suggest that Back Translation can significantly improve model performance, by up to 7 BLEU points, thus opening a window for surpassing state-of-the-art results with appropriate architectures.
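
The abstract refers to Byte Pair Encoding (BPE), a subword segmentation method that builds a vocabulary by iteratively merging the most frequent adjacent symbol pairs in a corpus. As an illustration only, not code from the paper, here is a minimal Python sketch of the BPE merge-learning loop in the style of Sennrich et al. (2016); the toy vocabulary and the number of merges are assumptions for demonstration:

```python
import re
from collections import Counter

def get_pair_stats(vocab):
    """Count frequencies of adjacent symbol pairs across the vocabulary."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for i in range(len(symbols) - 1):
            pairs[(symbols[i], symbols[i + 1])] += freq
    return pairs

def merge_pair(pair, vocab):
    """Rewrite the vocabulary, merging every occurrence of `pair` into one symbol."""
    bigram = re.escape(" ".join(pair))
    pattern = re.compile(r"(?<!\S)" + bigram + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq for word, freq in vocab.items()}

# Toy corpus (assumption): words as space-separated characters plus an
# end-of-word marker, mapped to their corpus frequencies.
vocab = {"l o w </w>": 5, "l o w e r </w>": 2,
         "n e w e s t </w>": 6, "w i d e s t </w>": 3}

num_merges = 10  # assumed hyperparameter; real vocabularies use thousands
for _ in range(num_merges):
    stats = get_pair_stats(vocab)
    if not stats:
        break
    best = max(stats, key=stats.get)  # most frequent adjacent pair
    vocab = merge_pair(best, vocab)
    print(best)
```

After enough merges, frequent fragments such as est</w> and low become single vocabulary symbols; this is how BPE reduces out-of-vocabulary tokens for sequence-to-sequence models like those discussed above.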
Citation: Domingo, O. [et al.]. Enhancing sequence-to-sequence modeling for RDF triples to natural text. A: WebNLG - International Workshop on Natural Language Generation from the Semantic Web. "Proceedings of the 3rd International Workshop on Natural Language Generation from the Semantic Web (WebNLG+)". Stroudsburg, PA: Association for Computational Linguistics, 2020, p. 40-47.
Publisher version: https://aclanthology.org/2020.webnlg-1.5/
Files | Size | Format
---|---|---
2020.webnlg-1.5.pdf | 353.2 KB | PDF