Topic

The Role of Alignment of Multilingual Contextualized Embeddings in Zero-Shot Cross-Lingual Transfer for Event Extraction

Karen Hambardzumyan

 

K. Hambardzumyan, H. Khachatrian, and J. May 

 

Abstract:

 

Contextualized word embeddings like BERT have enabled significant advances in many natural language processing tasks. Recently, multilingual versions of such embeddings were trained on large text corpora covering more than 100 languages. In this paper we investigate how well such embeddings perform in zero-shot cross-lingual transfer for an event extraction task. In particular, we analyze the impact of aligning contextualized word embeddings using a parallel corpus on the performance of the downstream task.
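The abstract does not specify how the alignment is performed; a common approach for aligning embedding spaces with a parallel corpus is to learn an orthogonal (Procrustes) map between embeddings of word-aligned token pairs. The sketch below is a minimal illustration of that idea only, not the paper's method; the arrays standing in for source- and target-language contextual embeddings (and the random toy data) are hypothetical placeholders for vectors one would extract from a multilingual encoder such as multilingual BERT.

import numpy as np

def learn_procrustes_alignment(src_emb: np.ndarray, tgt_emb: np.ndarray) -> np.ndarray:
    """Learn an orthogonal map W minimizing ||src_emb @ W - tgt_emb||_F.

    src_emb, tgt_emb: (n_pairs, dim) contextual embeddings of word-aligned
    token pairs taken from a parallel corpus (one row per aligned pair).
    """
    # Closed-form Procrustes solution via SVD of the cross-covariance matrix.
    u, _, vt = np.linalg.svd(src_emb.T @ tgt_emb)
    return u @ vt

# Toy stand-ins for embeddings extracted from a multilingual encoder
# over word-aligned parallel sentences (hypothetical data, dim = 768).
rng = np.random.default_rng(0)
dim, n_pairs = 768, 5000
tgt = rng.normal(size=(n_pairs, dim))                       # target-language side
true_rotation = np.linalg.qr(rng.normal(size=(dim, dim)))[0]
src = tgt @ true_rotation.T + 0.01 * rng.normal(size=(n_pairs, dim))  # source side

W = learn_procrustes_alignment(src, tgt)
aligned_src = src @ W   # source embeddings mapped into the target space
print(np.linalg.norm(aligned_src - tgt) / np.linalg.norm(tgt))  # small residual

Once such a map is learned, source-language embeddings can be projected into the target space before they are fed to an event extraction model trained only on target-language data, which is the zero-shot transfer setting the abstract describes.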

 

Discussion Room: The Role of Alignment of Multilingual Contextualized Embeddings in Zero-Shot Cross-Lingual Transfer for Event Extraction
