TCG-Event: Effective Task Conditioning for Generation-based Event Extraction

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

2 Citations (Scopus)

Abstract

Event extraction is an important but challenging task. Many existing techniques decompose it into subtasks of event and argument detection/classification, which are themselves complex structured prediction problems. Generation-based extraction techniques lessen the complexity of the problem formulation and are able to leverage the reasoning capabilities of large pre-trained language models. However, the large diversity of available event types makes it hard for generative models to effectively select the correct corresponding templates to predict the structured sequence. In this paper, we propose a task-conditioned generation-based event extraction model, TCG-Event, that addresses these challenges. A key contribution of TCG-Event is a novel task conditioning technique that injects event name information as prefixes into each layer of an encoder-decoder-based language model, thus enabling effective supervised learning. Our experiments on two benchmark datasets demonstrate the strong performance of our TCG-Event model.
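The abstract's key mechanism is injecting event-name information as prefixes into each layer of an encoder-decoder model. A minimal NumPy sketch of the general idea follows: event-derived prefix vectors are prepended to the keys and values of each layer's attention, so every token can attend to the task prefix. All names and dimensions here are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_prefix(q, k, v, prefix_k, prefix_v):
    # Prepend event-conditioned prefix vectors to keys/values so each
    # query position can attend to the task prefix at this layer.
    k = np.concatenate([prefix_k, k], axis=0)
    v = np.concatenate([prefix_v, v], axis=0)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

# Toy dimensions (hypothetical): hidden size, sequence length,
# prefix length, number of layers.
rng = np.random.default_rng(0)
d, seq, plen, layers = 8, 5, 2, 3
x = rng.normal(size=(seq, d))
# Assumed: a per-event embedding from which layer prefixes are derived.
event_emb = rng.normal(size=(plen, d))
for layer in range(layers):
    Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
    # Assumption for this sketch: prefixes reuse the layer's K/V projections.
    prefix_k, prefix_v = event_emb @ Wk, event_emb @ Wv
    x = attention_with_prefix(x @ Wq, x @ Wk, x @ Wv, prefix_k, prefix_v)
print(x.shape)  # (5, 8)
```

The design point the abstract emphasizes is that conditioning happens at every layer, not only at the input, which is what distinguishes this from simply prepending the event name to the source text.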

Original language: English
Title of host publication: Proceedings of the 20th Workshop of the Australasian Language Technology Association
Editors: David Powers, Jennifer Biggs, Pradeesh Parameswaran
Place of publication: Stroudsburg, PA, USA
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 9
Publication status: Published - 2022
Event: Australasian Language Technology Association Workshop 2022, Adelaide, Australia
Duration: 14 Dec 2022 - 16 Dec 2022
Conference number: 20th
https://aclanthology.org/volumes/2022.alta-1/ (Proceedings)
https://alta2022.alta.asn.au/ (Website)

Conference

Conference: Australasian Language Technology Association Workshop 2022
Abbreviated title: ALTA 2022
Country/Territory: Australia
City: Adelaide
Period: 14/12/22 - 16/12/22
