Improving numerical reasoning skills in the modular approach for complex question answering on text

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

3 Citations (Scopus)

Abstract

Numerical reasoning skills are essential for complex question answering (CQA) over text. The task requires operations including counting, comparison, addition, and subtraction. A successful approach to CQA on text, Neural Module Networks (NMNs), follows the programmer-interpreter paradigm and leverages specialised modules to perform compositional reasoning. However, the NMNs framework does not consider the relationship between numbers and entities in both questions and paragraphs. We propose effective techniques to improve NMNs' numerical reasoning capabilities by making the interpreter question-aware and capturing the relationship between entities and numbers. On the same subset of the DROP dataset for CQA on text, experimental results show that our additions outperform the original NMNs by 3.0 points on the overall F1 score.

Original language: English
Title of host publication: Findings of the Association for Computational Linguistics: EMNLP 2021
Editors: Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Place of publication: Stroudsburg PA USA
Publisher: Association for Computational Linguistics (ACL)
Pages: 2713–2718
Number of pages: 6
ISBN (Electronic): 9781955917100
Publication status: Published - 2021
Event: Empirical Methods in Natural Language Processing 2021 - Online, Punta Cana, Dominican Republic
Duration: 7 Nov 2021 – 11 Nov 2021
https://2021.emnlp.org/ (Website)
https://aclanthology.org/2021.emnlp-main.0/ (Proceedings)
https://aclanthology.org/2021.findings-emnlp.0/ (Proceedings - findings)

Conference

Conference: Empirical Methods in Natural Language Processing 2021
Abbreviated title: EMNLP 2021
Country/Territory: Dominican Republic
City: Punta Cana
Period: 7/11/21 – 11/11/21
Internet address: https://2021.emnlp.org/
