The Context-Dependent Additive Recurrent Neural Net

Hung Quan Tran, Tuan Manh Lai, Gholamreza Haffari, Ingrid Zukerman, Trung Bui, Hung Bui

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review


Contextual sequence mapping is one of the fundamental problems in Natural Language Processing. Instead of relying solely on the information presented in the text, the learning agent has access to a strong external signal that assists the learning process. In this paper, we propose a novel family of Recurrent Neural Network units: the Context-dependent Additive Recurrent Neural Network (CARNN), designed specifically to leverage this external signal. Experimental results on public datasets for dialog (bAbI dialog Task 6 and Frames), contextual language modelling (Switchboard and Penn Discourse Treebank) and question answering (TrecQA) show that our novel CARNN-based architectures outperform previous methods.
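The abstract describes a recurrent unit whose update is conditioned on an external context signal in addition to the input sequence. As a rough illustration of that idea (not the paper's exact equations), the following is a minimal NumPy sketch of one plausible context-gated additive recurrence: a gate computed from the current input, the previous state, and a fixed context vector `z` interpolates between a context-enriched input candidate and the previous hidden state. All weight names (`Wx`, `Wz`, `Wg`, `Ug`, `Cg`, `bg`) are hypothetical placeholders.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def context_additive_rnn(xs, z, Wx, Wz, Wg, Ug, Cg, bg):
    """Hedged sketch of a context-dependent additive recurrent unit.

    xs : list of input vectors (the sequence)
    z  : external context vector, fixed across time steps
    The remaining arguments are hypothetical weight matrices / bias.
    """
    h = np.zeros(Wg.shape[0])
    for x in xs:
        # Input candidate enriched with the external context signal.
        x_tilde = Wx @ x + Wz @ z
        # Gate conditioned on input, previous state, and context.
        g = sigmoid(Wg @ x + Ug @ h + Cg @ z + bg)
        # Additive (convex-combination) state update, with no extra
        # nonlinearity applied to the recurrent path.
        h = g * x_tilde + (1.0 - g) * h
    return h
```

The additive, gate-weighted update keeps the recurrent path close to an identity mapping, which is the general intuition behind additive recurrent units; the exact CARNN formulation should be taken from the paper itself.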
Original language: English
Title of host publication: Conference of the North American Chapter of the Association for Computational Linguistics
Subtitle of host publication: Human Language Technologies (NAACL HLT 2018)
Editors: Heng Ji, Amanda Stent
Place of Publication: Red Hook NY USA
Publisher: Association for Information Systems
Number of pages: 10
ISBN (Electronic): 9781510863460
Publication status: Published - 2018
Event: North American Association for Computational Linguistics 2018: Human Language Technologies - Hyatt Regency, New Orleans, United States of America
Duration: 1 Jun 2018 - 6 Jun 2018
Conference number: 16th


Conference: North American Association for Computational Linguistics 2018
Abbreviated title: NAACL HLT 2018
Country/Territory: United States of America
City: New Orleans


  • sequence mapping
