Abstract:
Online learning session dropout prediction aims to accurately predict whether a learner will exit a learning session during the online learning process, which is an important research task in the field of smart education. To address the low prediction accuracy of existing models in small-sample scenarios, a prefix-tuning based online learning session dropout prediction model, Prefix-LSDPM, is proposed. The model concatenates a prompt vector as a prefix to the sequence of learning behavior features, fixes the pre-trained parameter weights, and trains only the prefix prompt parameters. To capture the internal characteristics of individual learning behaviors and the implicit correlations between consecutive learning behaviors, the model performs masked learning on the prompt-synthesized sequences in a Transformer network with modified key-value vectors. Experiments are conducted on three pre-trained models (BERT, ALBERT, and UniLM) and three datasets (EdNet, XuetangX 1, and XuetangX 2). Ablation experiments show that, across the parameters of the three pre-trained models, a prompt sequence length of 3 tokens yields the best predictive performance for Prefix-LSDPM. Prefix-LSDPM based on ALBERT achieves the best prediction performance, with an AUC of 90.65%, which is 9.29% higher than the best existing model. Using the optimal prompt sequence length, Prefix-LSDPM is compared with fine-tuning on the three pre-trained models. Experimental results on multiple datasets show that the prediction accuracy of Prefix-LSDPM surpasses that of existing models, with the AUC increased by 5.19% compared with the fine-tuning method. In the small-sample experiments, Prefix-LSDPM trained on only 1% of the training samples still achieves an AUC of 86.95%, demonstrating that Prefix-LSDPM can achieve strong prediction performance in small-sample learning.
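
The following is a minimal sketch of the prefix-tuning idea summarized above, not the authors' released implementation: trainable prefix vectors are prepended to the key/value vectors of a self-attention layer while the pre-trained projection weights remain frozen, so only the prefix parameters are updated during training. Class and parameter names (e.g. PrefixSelfAttention, prefix_len) are hypothetical, and the prefix length of 3 mirrors the optimal value reported in the abstract.

```python
# Hypothetical sketch of prefix-tuned self-attention (PyTorch); assumed names, not the paper's code.
import torch
import torch.nn as nn

class PrefixSelfAttention(nn.Module):
    def __init__(self, hidden_dim: int, prefix_len: int = 3):
        super().__init__()
        self.q_proj = nn.Linear(hidden_dim, hidden_dim)
        self.k_proj = nn.Linear(hidden_dim, hidden_dim)
        self.v_proj = nn.Linear(hidden_dim, hidden_dim)
        # Trainable prefix key/value vectors: the only parameters updated during training.
        self.prefix_k = nn.Parameter(torch.randn(prefix_len, hidden_dim) * 0.02)
        self.prefix_v = nn.Parameter(torch.randn(prefix_len, hidden_dim) * 0.02)
        # Freeze the (pre-trained) projection weights, as in prefix-tuning.
        for proj in (self.q_proj, self.k_proj, self.v_proj):
            for p in proj.parameters():
                p.requires_grad = False

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden_dim) sequence of learning-behavior embeddings.
        q = self.q_proj(x)
        b = x.size(0)
        # Prepend the prefix to the key/value vectors of every example in the batch.
        k = torch.cat([self.prefix_k.expand(b, -1, -1), self.k_proj(x)], dim=1)
        v = torch.cat([self.prefix_v.expand(b, -1, -1), self.v_proj(x)], dim=1)
        attn = torch.softmax(q @ k.transpose(-2, -1) / q.size(-1) ** 0.5, dim=-1)
        return attn @ v

# Usage: a batch of 4 sessions, each a sequence of 20 behavior embeddings of size 768.
layer = PrefixSelfAttention(hidden_dim=768, prefix_len=3)
out = layer(torch.randn(4, 20, 768))
print(out.shape)  # torch.Size([4, 20, 768])
```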