AAAI 2020

Towards Minimal Supervision BERT-Based Grammar Error Correction (Student Abstract)

Short Paper · Student Abstract Track · Artificial Intelligence

Abstract

Current grammatical error correction (GEC) models typically treat the task as sequence generation, which requires large amounts of annotated data and limits their applicability in data-limited settings. We incorporate contextual information from a pre-trained language model to make better use of limited annotations and to benefit multilingual scenarios. Results show the strong potential of Bidirectional Encoder Representations from Transformers (BERT) for the grammatical error correction task.
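The abstract gives no code, but one common way to exploit a masked language model's contextual information for correction, in the spirit described above, is to score candidate sentences by pseudo-log-likelihood and keep the highest-scoring one. The sketch below is illustrative rather than the authors' method; it assumes the Hugging Face transformers library and the bert-base-uncased checkpoint, and the candidate sentences are hypothetical.

import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def pseudo_log_likelihood(sentence: str) -> float:
    """Mask each token in turn and sum BERT's log-probability of the
    original token at that position (a pseudo-log-likelihood score)."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    total = 0.0
    # Skip the [CLS] token at position 0 and the [SEP] token at the end.
    for i in range(1, ids.size(0) - 1):
        masked = ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, i]
        log_probs = torch.log_softmax(logits, dim=-1)
        total += log_probs[ids[i]].item()
    return total

# Rank candidate corrections: the grammatical variant should score higher.
candidates = ["He go to school every day.", "He goes to school every day."]
print(max(candidates, key=pseudo_log_likelihood))

Because the score sums per-token log-probabilities, longer candidates are penalized for having more positions; when candidates differ in length, normalizing by token count is a reasonable variant.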

Keywords

No keywords are indexed for this paper.

Context

Venue: AAAI Conference on Artificial Intelligence
Archive span: 1980-2026
Indexed papers: 28718
Paper id: 615959756878849474