Computer science
Artificial intelligence
Identification (biology)
Cognition
Deep learning
Machine learning
Data science
Psychology
Plant
Neuroscience
Biology
Authors
Zhi Liu, Xi Kong, Hao Chen, Sannyuya Liu, Zongkai Yang
Source
Journal: IEEE Transactions on Learning Technologies
[Institute of Electrical and Electronics Engineers]
Date: 2023-01-31
Volume/Issue: 16 (4): 528-542
Cited by: 12
Identifiers
DOI: 10.1109/tlt.2023.3240715
Abstract
In a massive open online courses (MOOCs) learning environment, it is essential for instructors to understand students' social knowledge construction and critical thinking in order to design intervention strategies. The development of social knowledge construction and critical thinking can be represented by cognitive presence, a primary component of the community of inquiry model. However, identifying learners' cognitive presence is a challenging problem, and most researchers have performed this task using traditional machine learning methods that require both manual feature construction and adequate labeled data. In this article, we present a novel variant of the bidirectional encoder representations from transformers (BERT) model for cognitive presence identification, namely MOOC-BERT, which is pretrained on large-scale unlabeled discussion data collected from various MOOCs spanning different disciplines. MOOC-BERT learned deep representations of unlabeled data and adopted Chinese characters as inputs without any feature engineering. The experimental results showed that MOOC-BERT outperformed representative machine learning algorithms and deep learning models in both identification performance and cross-course generalization. MOOC-BERT was then adopted to identify the unlabeled posts of two courses. The empirical analysis revealed the evolution of, and differences in, MOOC learners' cognitive presence levels. These findings provide valuable insights into the effectiveness of pretraining on large-scale, multidiscipline discussion data for accurate cognitive presence identification, demonstrating the practical value of MOOC-BERT in learning analytics.
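The abstract frames cognitive presence identification as a post-level classification task. The standard community of inquiry coding scheme (Garrison et al.) distinguishes four cognitive presence phases, typically plus an "other" class for posts exhibiting none; the abstract does not state MOOC-BERT's exact label set, so the mapping below is an illustrative assumption, as are the sample posts and the `encode_labels` helper:

```python
# Illustrative sketch (not the authors' code): the standard CoI coding scheme
# assigns each discussion post one cognitive presence phase. The integer ids
# here are a hypothetical label encoding for a sentence-classification setup.
COGNITIVE_PRESENCE_PHASES = {
    0: "other",             # no cognitive presence observed
    1: "triggering event",  # a problem or question is raised
    2: "exploration",       # ideas and information are exchanged
    3: "integration",       # ideas are connected into a tentative solution
    4: "resolution",        # the solution is applied or defended
}

def encode_labels(annotated_posts):
    """Map (post, phase-name) pairs to (post, integer-label) pairs,
    the form a classifier fine-tuning pipeline would consume."""
    name_to_id = {name: i for i, name in COGNITIVE_PRESENCE_PHASES.items()}
    return [(post, name_to_id[phase]) for post, phase in annotated_posts]

# Two hypothetical (translated) discussion posts with annotated phases.
sample = [
    ("Why does gradient descent diverge here?", "triggering event"),
    ("Combining both suggestions, a smaller learning rate fixes it.", "integration"),
]
print(encode_labels(sample))
```

In MOOC-BERT itself, per the abstract, the inputs are raw Chinese characters with no feature engineering; a pretrained encoder produces the post representation and the phase label is predicted from it, so labeled pairs like the above are needed only for the supervised fine-tuning stage.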