UPCommons. Global access to UPC knowledge


Syntax-driven iterative expansion language models for controllable text generation

Cite as:
hdl:2117/341494

Authors: Casas Manzanares, Noé; Rodríguez Fonollosa, José Adrián; Ruiz Costa-Jussà, Marta
Document type: Conference lecture
Defense date: 2020
Publisher: Association for Computational Linguistics
Rights access: Open Access
License: Attribution 4.0 International. Except where otherwise noted, content on this work is licensed under a Creative Commons license: Attribution 4.0 International.
Projects: AUTONOMOUS LIFELONG LEARNING INTELLIGENT SYSTEMS (AEI-PCIN-2017-079); ARQUITECTURAS AVANZADAS DE APRENDIZAJE PROFUNDO APLICADAS AL PROCESADO DE VOZ, AUDIO Y LENGUAJE (AEI-PID2019-107579RB-I00)
Abstract
The dominant language modeling paradigm handles text as a sequence of discrete tokens. While that approach can capture the latent structure of the text, it is inherently constrained to sequential dynamics for text generation. We propose a new paradigm for introducing a syntactic inductive bias into neural text generation, where the dependency parse tree is used to drive the Transformer model to generate sentences iteratively. Our experiments show that this paradigm is effective at text generation, with quality between that of LSTMs and Transformers and comparable diversity, while requiring less than half their decoding steps. Its generation process also allows direct control over the syntactic constructions of the generated text, enabling the induction of stylistic variations.
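The abstract only sketches the decoding idea, so the following is a minimal, hypothetical Python illustration (not the authors' code) of what tree-driven iterative expansion can look like: every placeholder in the partial sequence is expanded in parallel at each step, so the number of decoding iterations tracks the dependency-tree depth rather than the sentence length, consistent with the abstract's claim of fewer decoding steps. The names `iterative_expansion_decode`, `expand_fn`, and `PLACEHOLDER` are invented for this sketch; in the paper the expansion decisions would come from the Transformer model.

```python
# Conceptual sketch (assumption, not the published implementation) of
# level-wise iterative expansion decoding driven by a dependency structure.
from typing import Callable, List, Tuple

PLACEHOLDER = "<EXPAND>"  # marks a position still to be expanded

# expand_fn(partial_sequence, position) -> (head_token, left_deps, right_deps)
ExpandFn = Callable[[List[str], int], Tuple[str, List[str], List[str]]]


def iterative_expansion_decode(expand_fn: ExpandFn, max_iters: int = 10) -> List[str]:
    sequence = [PLACEHOLDER]  # start from a single root placeholder
    for _ in range(max_iters):
        if PLACEHOLDER not in sequence:
            break  # every position is a concrete token: decoding finished
        next_sequence: List[str] = []
        for i, symbol in enumerate(sequence):
            if symbol != PLACEHOLDER:
                next_sequence.append(symbol)
                continue
            # Expand all placeholders of the current level in one iteration;
            # the predicted dependents may themselves be placeholders.
            head, left, right = expand_fn(sequence, i)
            next_sequence.extend(left + [head] + right)
        sequence = next_sequence
    return [s for s in sequence if s != PLACEHOLDER]


if __name__ == "__main__":
    # Toy stand-in for the learned model: a fixed expansion table.
    table = {
        0: ("likes", [PLACEHOLDER], [PLACEHOLDER]),  # root verb, two dependents
        1: ("she", [], []),
        2: ("tea", ["hot"], []),
    }
    calls = iter(range(len(table)))
    print(iterative_expansion_decode(lambda ctx, i: table[next(calls)]))
    # -> ['she', 'likes', 'hot', 'tea'], produced in 2 expansion iterations
    #    instead of 4 left-to-right decoding steps.
```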
Citation: Casas, N.; Fonollosa, J.A.R.; Costa-jussà, M.R. Syntax-driven iterative expansion language models for controllable text generation. In: Conference on Empirical Methods in Natural Language Processing. "EMNLP 2020, Structured Prediction for NLP: proceedings of the Fourth Workshop: November 20, 2020". Stroudsburg, PA: Association for Computational Linguistics, 2020, p. 1-10. ISBN 978-1-952148-83-5.
URI: http://hdl.handle.net/2117/341494
ISBN: 978-1-952148-83-5
Publisher version: https://www.aclweb.org/anthology/2020.spnlp-1.1/
Collections
  • Departament de Ciències de la Computació - Ponències/Comunicacions de congressos [1.219]
  • Doctorat en Teoria del Senyal i Comunicacions - Ponències/Comunicacions de congressos [183]
  • VEU - Grup de Tractament de la Parla - Ponències/Comunicacions de congressos [436]
  • Departament de Teoria del Senyal i Comunicacions - Ponències/Comunicacions de congressos [3.213]

Files:
  • 2020.spnlp-1.1.pdf (355.2 KB, PDF)
