Abstract
Event extraction is an important but challenging task. Many existing techniques decompose it into the subtasks of event and argument detection/classification, which are themselves complex structured prediction problems. Generation-based extraction techniques simplify the problem formulation and can leverage the reasoning capabilities of large pre-trained language models. However, the large diversity of event types makes it hard for generative models to select the correct template when predicting the structured sequence. In this paper, we propose a task-conditioned generation-based event extraction model, TCG-Event, that addresses these challenges. A key contribution of TCG-Event is a novel task conditioning technique that injects event name information as prefixes into each layer of an encoder-decoder-based language model, thus enabling effective supervised learning. Our experiments on two benchmark datasets demonstrate the strong performance of our TCG-Event model.
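The per-layer prefix conditioning described above can be illustrated with a minimal sketch. This is not the authors' implementation: the dimensions, the event-to-prefix lookup table, and the stand-in `toy_layer` (a simple position-mixing nonlinearity in place of a real transformer layer) are all illustrative assumptions. The only idea it demonstrates is prepending an event-specific prefix to the hidden states at every layer so the event name conditions the whole encoding.

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN, PREFIX_LEN, NUM_LAYERS = 8, 2, 3

# Hypothetical learned prefix table: one prefix per (event type, layer).
# In the paper these would be trained parameters, not random vectors.
event_prefixes = {
    "Attack": [rng.normal(size=(PREFIX_LEN, HIDDEN)) for _ in range(NUM_LAYERS)],
    "Transport": [rng.normal(size=(PREFIX_LEN, HIDDEN)) for _ in range(NUM_LAYERS)],
}

def toy_layer(h):
    # Stand-in for a transformer layer: every position mixes (uniformly)
    # with all positions, so the prefix influences every token state.
    return np.tanh(h + h.mean(axis=0, keepdims=True))

def encode(tokens, event_name):
    """Run a toy encoder, injecting the event-specific prefix at each layer."""
    h = tokens
    for layer_idx in range(NUM_LAYERS):
        prefix = event_prefixes[event_name][layer_idx]
        h = np.concatenate([prefix, h], axis=0)  # prepend prefix positions
        h = toy_layer(h)
        h = h[PREFIX_LEN:]  # keep only the token positions for the next layer
    return h

tokens = rng.normal(size=(5, HIDDEN))  # 5 toy token embeddings
out = encode(tokens, "Attack")
print(out.shape)  # (5, 8)
```

Because a distinct prefix is injected at every layer, the same input tokens yield different representations under different event names, which is what lets the decoder select the matching template.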
Original language | English |
---|---|
Title of host publication | Proceedings of the 20th Workshop of the Australasian Language Technology Association |
Editors | David Powers, Jennifer Biggs, Pradeesh Parameswaran |
Place of Publication | Stroudsburg PA USA |
Publisher | Association for Computational Linguistics (ACL) |
Number of pages | 9 |
Publication status | Published - 2022 |
Event | Australasian Language Technology Association Workshop 2022 - Adelaide, Australia |
Duration | 14 Dec 2022 → 16 Dec 2022 |
Conference number | 20th |
Proceedings | https://aclanthology.org/volumes/2022.alta-1/ |
Website | https://alta2022.alta.asn.au/ |
Conference
Conference | Australasian Language Technology Association Workshop 2022 |
---|---|
Abbreviated title | ALTA 2022 |
Country/Territory | Australia |
City | Adelaide |
Period | 14/12/22 → 16/12/22 |
Internet address | https://aclanthology.org/volumes/2022.alta-1/ (Proceedings); https://alta2022.alta.asn.au/ (Website) |