The Context-Dependent Additive Recurrent Neural Net

Hung Quan Tran, Tuan Manh Lai, Gholamreza Haffari, Ingrid Zukerman, Trung Bui, Hung Bui

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

Contextual sequence mapping is one of the fundamental problems in Natural Language Processing. Instead of relying solely on the information presented in the text, the learning agent also has access to a strong external signal that assists the learning process. In this paper, we propose a novel family of Recurrent Neural Network units, the Context-dependent Additive Recurrent Neural Network (CARNN), designed specifically to leverage this external signal. Experimental results on public datasets for dialog (bAbI dialog Task 6 and Frames), contextual language modelling (Switchboard and Penn Discourse Treebank) and question answering (TrecQA) show that our novel CARNN-based architectures outperform previous methods.
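The abstract describes recurrent units whose update is conditioned on an external context signal. As a rough illustration of that idea only (the cell below is a generic context-gated additive recurrent step, not the paper's exact CARNN equations; all weight names and the gating form are assumptions), a hidden state can be computed as an additive interpolation between the previous state and a candidate, with both the gate and the candidate conditioned on a fixed context vector:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ContextAdditiveCell:
    """Illustrative context-gated additive recurrent cell (a sketch, not the
    paper's CARNN): the new hidden state is a convex combination of the
    previous state and a candidate update, where both the gate and the
    candidate depend on an external context vector c."""

    def __init__(self, input_dim, hidden_dim, context_dim, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1
        self.Wx = rng.normal(0, s, (hidden_dim, input_dim))    # input -> gate
        self.Wc = rng.normal(0, s, (hidden_dim, context_dim))  # context -> gate
        self.b = np.zeros(hidden_dim)
        self.Ux = rng.normal(0, s, (hidden_dim, input_dim))    # input -> candidate
        self.Uc = rng.normal(0, s, (hidden_dim, context_dim))  # context -> candidate

    def step(self, x, h_prev, c):
        g = sigmoid(self.Wx @ x + self.Wc @ c + self.b)  # context-dependent gate
        cand = np.tanh(self.Ux @ x + self.Uc @ c)        # context-aware candidate
        return g * cand + (1.0 - g) * h_prev             # additive interpolation

    def run(self, xs, c):
        """Run the cell over a sequence of inputs under one fixed context."""
        h = np.zeros_like(self.b)
        for x in xs:
            h = self.step(x, h, c)
        return h
```

Because the candidate passes through tanh and the gate is a convex weight, the hidden state stays bounded in (-1, 1); the external context influences every step without being part of the recurrent sequence itself.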
Original language: English
Title of host publication: Conference of the North American Chapter of the Association for Computational Linguistics
Subtitle of host publication: Human Language Technologies (NAACL HLT 2018)
Editors: Heng Ji, Amanda Stent
Place of Publication: Red Hook NY USA
Publisher: Association for Computational Linguistics (ACL)
Pages: 1274-1283
Number of pages: 10
Volume: 1
ISBN (Electronic): 9781510863460
Publication status: Published - 2018
Event: North American Association for Computational Linguistics 2018: Human Language Technologies - Hyatt Regency, New Orleans, United States of America
Duration: 1 Jun 2018 - 6 Jun 2018
Conference number: 16th
http://naacl2018.org/
https://www.aclweb.org/anthology/volumes/N18-1/ (Proceedings)

Conference

Conference: North American Association for Computational Linguistics 2018
Abbreviated title: NAACL HLT 2018
Country/Territory: United States of America
City: New Orleans
Period: 1/06/18 - 6/06/18

Keywords

  • sequence mapping
