Keywords
Computer science
Artificial intelligence
Transformer
Leverage (statistics)
Artificial neural network
Machine learning
Embedding
Labeled data
Training set
Token
Pattern recognition (psychology)
Engineering
Voltage
Electrical engineering
Computer security
Authors
Jinlong Hu, Yangmin Huang, Nan Wang, Shoubin Dong
Source
Journal: IEEE Transactions on Neural Systems and Rehabilitation Engineering
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Volume 32, pp. 2727-2736
Citations: 2
Identifier
DOI:10.1109/tnsre.2024.3434343
Abstract
Deep learning methods have advanced quickly in brain imaging analysis over the past few years, but they are usually restricted by the limited availability of labeled data. Pre-training models on unlabeled data has shown promising improvements in feature learning in many domains, such as natural language processing, yet this technique remains under-explored in brain network analysis. In this paper, we focused on pre-training methods with Transformer networks to leverage existing unlabeled data for brain functional network classification. First, we proposed a Transformer-based neural network, named BrainNPT, for brain functional network classification. The proposed method leveraged a classification token as an embedding vector, enabling the Transformer model to effectively capture the representation of brain networks. Second, we proposed a pre-training framework for the BrainNPT model that leverages unlabeled brain network data to learn the structural information of brain functional networks. The classification experiments demonstrated that the BrainNPT model without pre-training achieved the best performance among the state-of-the-art models, and that the BrainNPT model with pre-training strongly outperformed the state-of-the-art models. Pre-training improved the accuracy of the BrainNPT model by 8.75% compared with the model without pre-training. We further compared pre-training strategies and data augmentation methods, analyzed the influence of the model's parameters, and explained the trained model.
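The abstract's central architectural idea — prepending a learnable classification token to the Transformer's input so its output serves as the network-level representation — can be sketched as follows. This is a minimal illustrative PyTorch sketch, not the authors' released BrainNPT code; all hyperparameters (number of ROIs, embedding size, layer counts) and the choice to treat each ROI's connectivity row as one input token are assumptions for illustration.

```python
import torch
import torch.nn as nn

class BrainNetClassifier(nn.Module):
    """Sketch of a Transformer classifier for brain functional networks
    using a learnable classification token, loosely following the idea in
    the abstract (hypothetical hyperparameters, not the authors' code)."""

    def __init__(self, n_rois=90, d_model=128, n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        # Each ROI's row of the functional connectivity matrix becomes one token.
        self.proj = nn.Linear(n_rois, d_model)
        # Learnable classification token, prepended to the ROI tokens.
        self.cls_token = nn.Parameter(torch.zeros(1, 1, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=256, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, conn):               # conn: (batch, n_rois, n_rois)
        x = self.proj(conn)                # (batch, n_rois, d_model)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        x = torch.cat([cls, x], dim=1)     # (batch, n_rois + 1, d_model)
        x = self.encoder(x)
        # Classify from the classification token's output position.
        return self.head(x[:, 0])

model = BrainNetClassifier()
logits = model(torch.randn(4, 90, 90))     # 4 subjects' connectivity matrices
print(logits.shape)                        # torch.Size([4, 2])
```

Because attention mixes information from all ROI tokens into the classification token, its final state acts as a learned summary of the whole network, which is the role the abstract attributes to the token in BrainNPT.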