A Little Pretraining Goes a Long Way: A Case Study on Dependency Parsing Task for Low-resource Morphologically Rich Languages

Jivnesh Sandhan, Amrith Krishna, Ashim Gupta, Laxmidhar Behera, Pawan Goyal

Student Research Workshop, Long Paper

Gather-2F: Apr 22 (13:00-15:00 UTC)

Abstract: Neural dependency parsing has achieved remarkable performance for many domains and languages, but the need for massive amounts of labelled data limits the effectiveness of these approaches for low-resource languages. In this work, we focus on dependency parsing for morphologically rich languages (MRLs) in a low-resource setting. Although morphological information is essential for dependency parsing, the need for morphological disambiguation and the lack of powerful analyzers make this information hard to obtain for MRLs. To address these challenges, we propose simple auxiliary tasks for pretraining. We perform experiments on 10 MRLs in low-resource settings to measure the efficacy of our proposed pretraining method and observe average absolute gains of 2 points (UAS) and 3.6 points (LAS).
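The abstract does not spell out which auxiliary tasks are used, so the following is a minimal, hypothetical sketch (in PyTorch) of the general recipe it describes: pretrain a shared encoder on a cheap token-level auxiliary task, then reuse that encoder inside a biaffine dependency parser fine-tuned on the small treebank. The class names, the choice of morphological tagging as the auxiliary task, and all dimensions are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch (NOT the authors' code) of pretraining with an auxiliary
# task, then transferring the encoder to a dependency parser.
import torch
import torch.nn as nn


class SharedEncoder(nn.Module):
    """BiLSTM encoder shared between the auxiliary task and the parser."""
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim,
                            batch_first=True, bidirectional=True)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) -> contextual states: (batch, seq_len, 2*hidden)
        return self.lstm(self.embed(token_ids))[0]


class AuxiliaryTagger(nn.Module):
    """Pretraining head: predict a per-token label.
    Morphological tagging is an *assumed* example of an auxiliary task."""
    def __init__(self, encoder, num_tags):
        super().__init__()
        self.encoder = encoder
        self.out = nn.Linear(2 * encoder.lstm.hidden_size, num_tags)

    def forward(self, token_ids):
        return self.out(self.encoder(token_ids))


class BiaffineParser(nn.Module):
    """Parsing head: score every head-dependent pair (unlabelled arcs only)."""
    def __init__(self, encoder):
        super().__init__()
        self.encoder = encoder
        d = 2 * encoder.lstm.hidden_size
        self.head_mlp = nn.Linear(d, d)
        self.dep_mlp = nn.Linear(d, d)

    def forward(self, token_ids):
        h = self.encoder(token_ids)
        # arc_scores[b, i, j] = score of token j being the head of token i
        return torch.einsum('bid,bjd->bij', self.dep_mlp(h), self.head_mlp(h))


# Usage: pretrain the encoder on the auxiliary task, then hand it to the parser.
encoder = SharedEncoder(vocab_size=10_000)
tagger = AuxiliaryTagger(encoder, num_tags=50)
# ... train `tagger` on cheaply obtainable auxiliary labels ...
parser = BiaffineParser(encoder)  # pretrained encoder weights carry over
# ... fine-tune `parser` on the small low-resource treebank ...
```

The key design point the abstract implies is weight sharing: because the same encoder object is passed to both heads, whatever the auxiliary pretraining teaches it transfers to the parser for free.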


Similar Papers

MTOP: A Comprehensive Multilingual Task-Oriented Semantic Parsing Benchmark
Haoran Li, Abhinav Arora, Shuohui Chen, Anchit Gupta, Sonal Gupta, Yashar Mehdad
On the evolution of syntactic information encoded by BERT's contextualized representations
Laura Pérez-Mayos, Roberto Carlini, Miguel Ballesteros, Leo Wanner
Few-Shot Semantic Parsing for New Predicates
Zhuang Li, Lizhen Qu, Shuo Huang, Gholamreza Haffari