Jingle BERT, Jingle BERT, Frozen All the Way: Freezing Layers to Identify CEFR Levels of Second Language Learners Using BERT

Authors

  • Ricardo Muñoz Sánchez University of Gothenburg
  • David Alfter University of Gothenburg
  • Simon Dobnik University of Gothenburg
  • Maria Szawerna University of Gothenburg
  • Elena Volodina University of Gothenburg

DOI:

https://doi.org/10.3384/ecp211011

Keywords:

large language models, automated essay scoring

Abstract

In this paper, we investigate how much domain adaptation is needed for the task of automatic essay assessment by freezing layers in BERT models. We test our methodology on three graded language corpora (English, French, and Swedish) and find that partially fine-tuning base models improves performance over fully fine-tuning them, although the number of layers to freeze differs by language. We also examine the effect of freezing layers on the different grades in the corpora and find that different layers are important for different grade levels. Finally, our results represent a new state of the art in automatic essay classification for the three languages under investigation.
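The core technique the abstract describes, freezing the lower encoder layers of a pre-trained BERT model so that only the upper layers and the classification head are fine-tuned, can be sketched roughly as below. This is a minimal illustration assuming a HuggingFace BertForSequenceClassification; the model name, the freezing threshold k, and the six-way CEFR label set are illustrative assumptions, not the paper's exact configuration (the paper tunes the number of frozen layers per language).

    # Sketch: freeze the embeddings and the first k encoder layers of BERT
    # before fine-tuning it as a CEFR-level classifier.
    import torch
    from transformers import BertForSequenceClassification

    def freeze_bert_layers(model: BertForSequenceClassification, k: int) -> None:
        """Freeze the embedding layer and the first k encoder layers."""
        for param in model.bert.embeddings.parameters():
            param.requires_grad = False
        for layer in model.bert.encoder.layer[:k]:
            for param in layer.parameters():
                param.requires_grad = False

    model = BertForSequenceClassification.from_pretrained(
        "bert-base-cased",  # illustrative; a language-specific BERT would be used
        num_labels=6,       # e.g. the six CEFR levels A1-C2 (assumed label set)
    )
    freeze_bert_layers(model, k=8)  # only layers 8-11 and the classifier head train

    # Pass only the unfrozen parameters to the optimizer.
    optimizer = torch.optim.AdamW(
        (p for p in model.parameters() if p.requires_grad), lr=2e-5
    )

Freezing lower layers keeps the general-purpose representations of the pre-trained model intact while adapting only the upper, more task-specific layers, which is why the amount of freezing acts as a knob for how much domain adaptation the task receives.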


Published

2024-10-15