Teaching Neural Module Networks to do arithmetic

Jiayi Chen, Xiao-Yu Guo, Yuan-Fang Li, Gholamreza Haffari

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

Answering complex questions that require multi-step, multi-type reasoning over raw text is challenging, especially when numerical reasoning is involved. Neural Module Networks (NMNs) follow the programmer-interpreter framework and design trainable modules to learn different reasoning skills. However, NMNs have only limited reasoning abilities and lack numerical reasoning capability. We upgrade NMNs by (a) bridging the gap between the interpreter and complex questions, and (b) introducing addition and subtraction modules that perform numerical reasoning over numbers. On a subset of DROP, experimental results show that our proposed methods enhance NMNs' numerical reasoning skills, improving F1 score by 17.7% and significantly outperforming previous state-of-the-art models.
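The addition and subtraction modules mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' code: it assumes the common DROP-style NMN design in which each arithmetic module receives soft attention distributions over the numbers mentioned in the passage and outputs a distribution over candidate answers (here, all pairwise sums). The function name `add_module` and the toy inputs are hypothetical.

```python
# Hedged sketch of an NMN-style "addition" module (assumed design, not
# the paper's implementation): combine two attention distributions over
# passage numbers into a distribution over their pairwise sums.
def add_module(numbers, attn_a, attn_b):
    """numbers: floats mentioned in the passage;
    attn_a, attn_b: probabilities over those numbers (each sums to 1)."""
    sums = {}
    for i, x in enumerate(numbers):
        for j, y in enumerate(numbers):
            s = round(x + y, 6)
            # joint probability that x and y are the two chosen operands
            sums[s] = sums.get(s, 0.0) + attn_a[i] * attn_b[j]
    return sums  # distribution over candidate answer values


numbers = [3.0, 7.0, 10.0]
attn_a = [0.8, 0.1, 0.1]   # mostly attends to 3
attn_b = [0.1, 0.8, 0.1]   # mostly attends to 7
dist = add_module(numbers, attn_a, attn_b)
best = max(dist, key=dist.get)  # most probable sum: 10.0
```

A subtraction module would be identical except for computing `x - y`; because everything is a weighted sum of products of attention weights, the module stays differentiable and trainable end-to-end.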

Original language: English
Title of host publication: Proceedings of the Main Conference - The 29th International Conference on Computational Linguistics
Editors: Hansaem Kim, James Pustejovsky, Leo Wanner
Place of publication: Stroudsburg PA USA
Publisher: Association for Computational Linguistics (ACL)
Pages: 1502-1510
Number of pages: 9
Volume: 29
Edition: 1
Publication status: Published - 2022
Event: International Conference on Computational Linguistics 2022 - Gyeongju, Korea, South
Duration: 12 Oct 2022 – 17 Oct 2022
Conference number: 29th
https://coling2022.org/
https://aclanthology.org/volumes/2022.coling-1/ (Proceedings)

Conference

Conference: International Conference on Computational Linguistics 2022
Abbreviated title: COLING
Country/Territory: Korea, South
City: Gyeongju
Period: 12/10/22 – 17/10/22
