Graph-to-sequence learning using Gated Graph Neural Networks

Daniel Beck, Gholamreza Haffari, Trevor Cohn

    Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

    49 Citations (Scopus)


    Many NLP applications can be framed as graph-to-sequence learning problems.
    Previous work proposing neural architectures for this setting obtained promising
    results compared to grammar-based approaches, but still relied on linearisation
    heuristics and/or standard recurrent networks to achieve the best performance.
    In this work, we propose a new model that encodes the full structural information contained in the graph. Our architecture couples the recently proposed Gated Graph Neural Networks with an input transformation that allows nodes and edges to have their own hidden representations, while tackling the parameter explosion problem present in previous work. Experimental results show that our model outperforms strong baselines in generation from AMR graphs and syntax-based neural machine translation.
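    The core building block the abstract refers to, the Gated Graph Neural Network (Li et al., 2016), propagates information over a graph by having each node aggregate messages from its neighbours and update its state with a GRU cell. The sketch below is a minimal illustration of one such propagation step in plain NumPy; the weight names, the single edge type, and the tiny example graph are illustrative assumptions, not the paper's exact configuration (the paper additionally transforms the input so that edge labels become nodes with their own representations).

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class GatedGraphLayer:
        """Minimal sketch of one GGNN propagation step: neighbour
        aggregation followed by a GRU-style state update."""

        def __init__(self, dim, rng):
            self.W_msg = rng.standard_normal((dim, dim)) * 0.1  # edge transform
            # GRU parameters: update gate z, reset gate r, candidate state
            self.W_z = rng.standard_normal((dim, dim)) * 0.1
            self.U_z = rng.standard_normal((dim, dim)) * 0.1
            self.W_r = rng.standard_normal((dim, dim)) * 0.1
            self.U_r = rng.standard_normal((dim, dim)) * 0.1
            self.W_h = rng.standard_normal((dim, dim)) * 0.1
            self.U_h = rng.standard_normal((dim, dim)) * 0.1

        def step(self, H, A):
            """One propagation step.
            H: (n, dim) node states; A: (n, n) adjacency matrix,
            where A[i, j] = 1 means node i receives a message from node j."""
            m = A @ H @ self.W_msg                    # aggregate neighbour messages
            z = sigmoid(m @ self.W_z + H @ self.U_z)  # update gate
            r = sigmoid(m @ self.W_r + H @ self.U_r)  # reset gate
            h_cand = np.tanh(m @ self.W_h + (r * H) @ self.U_h)
            return (1 - z) * H + z * h_cand

    rng = np.random.default_rng(0)
    layer = GatedGraphLayer(dim=4, rng=rng)
    # Tiny 3-node chain graph: 0 <- 1 <- 2 (message direction).
    A = np.array([[0., 1., 0.],
                  [0., 0., 1.],
                  [0., 0., 0.]])
    H = rng.standard_normal((3, 4))
    H_next = layer.step(H, A)
    print(H_next.shape)
    ```

    Stacking several such steps lets information flow along longer paths in the graph; the encoder's final node states are then attended over by a sequence decoder.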
    Original language: English
    Title of host publication: ACL 2018 - The 56th Annual Meeting of the Association for Computational Linguistics
    Subtitle of host publication: Proceedings of the Conference, Vol. 1 (Long Papers)
    Editors: Iryna Gurevych, Yusuke Miyao
    Place of publication: Stroudsburg PA USA
    Publisher: Association for Computational Linguistics (ACL)
    Number of pages: 11
    ISBN (Print): 9781948087322
    Publication status: Published - 2018
    Event: Annual Meeting of the Association for Computational Linguistics 2018 - Melbourne, Australia
    Duration: 15 Jul 2018 - 20 Jul 2018
    Conference number: 56th


    Conference: Annual Meeting of the Association for Computational Linguistics 2018
    Abbreviated title: ACL 2018
