
AAAI 2023

Script, Language, and Labels: Overcoming Three Discrepancies for Low-Resource Language Specialization

Conference Paper | AAAI Technical Track on Speech & Natural Language Processing

Abstract

Although multilingual pretrained language models (mPLMs) enable a variety of natural language processing tasks in diverse languages, their coverage of only 100+ languages leaves 6,500+ languages 'unseen'. A common approach for an unseen language is to specialize the model for it as the target, by performing additional masked language modeling (MLM) on a target-language corpus. However, we argue that such naive specialization can be suboptimal due to discrepancies from multilingual MLM pretraining. Specifically, we identify three discrepancies to overcome. Script and linguistic discrepancies between the target language and related seen languages hinder positive transfer; to address these, we propose maximizing representation similarity, unlike existing approaches that maximize overlaps. In addition, the label space for MLM prediction can vary across languages, for which we propose reinitializing the top layers for more effective adaptation. Experiments over four language families and three tasks show that our method improves task performance on unseen languages with statistical significance, while the previous approach fails to do so.
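The top-layer reinitialization described in the abstract can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation; the encoder below is a small stand-in for an mPLM, and the layer count and initialization scheme are illustrative assumptions. The idea is simply to discard the pretrained parameters of the top layers (which are closest to the MLM prediction head and its source-language label space) while keeping the lower layers intact.

```python
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical stand-in for an mPLM encoder: a small 4-layer Transformer stack.
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(
        d_model=32, nhead=4, dim_feedforward=64, batch_first=True
    ),
    num_layers=4,
)

def reinitialize_top_layers(enc: nn.TransformerEncoder, num_top: int) -> None:
    """Re-randomize the weight matrices of the top `num_top` layers so they
    can adapt to the target language during continued MLM, instead of
    staying tied to the pretraining label space."""
    for layer in enc.layers[-num_top:]:
        for p in layer.parameters():
            if p.dim() > 1:  # reinit weight matrices; leave biases/norms as-is
                nn.init.xavier_uniform_(p)

before = copy.deepcopy(encoder.state_dict())
reinitialize_top_layers(encoder, num_top=2)
after = encoder.state_dict()

# Bottom layers keep their pretrained weights; top layers are fresh.
bottom_same = torch.equal(
    before["layers.0.linear1.weight"], after["layers.0.linear1.weight"]
)
top_changed = not torch.equal(
    before["layers.3.linear1.weight"], after["layers.3.linear1.weight"]
)
print(bottom_same, top_changed)
```

After reinitialization, continued MLM on the target-language corpus would train the whole stack, with the fresh top layers free to fit the target language's prediction distribution.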

Keywords

  • SNLP: Language Models
  • SNLP: Learning & Optimization for SNLP
  • SNLP: Machine Translation & Multilinguality
  • SNLP: Syntax -- Tagging, Chunking & Parsing

Context

Venue
AAAI Conference on Artificial Intelligence
Archive span
1980-2026
Indexed papers
28718
Paper id
802350276169016162