Human Connectome Project
Computer Science
Functional Magnetic Resonance Imaging
Artificial Intelligence
Context
Task
Language Model
Recurrent Neural Network
Artificial Neural Network
Pattern Recognition (Psychology)
Machine Learning
Speech Recognition
Functional Connectivity
Neuroscience
Economics
Management
Paleontology
Biology
Authors
Mengshen He,Xiangyu Hou,Zhenwei Wang,Zili Kang,Xin Zhang,Qiang Ning,Bao Ge
Identifier
DOI:10.1007/978-3-031-16431-6_28
Abstract
It has been of great interest in the neuroimaging community to discover brain functional networks (FBNs) from task functional magnetic resonance imaging (tfMRI). A variety of models have been applied to tfMRI sequences, such as recurrent neural networks (RNNs) and autoencoders. However, these models are not designed around a key characteristic of tfMRI sequences: the same signal value at different time points in an fMRI time series may represent different states and meanings. Inspired by cloze-style learning and the human ability to disambiguate polysemous words from context, we propose a self-supervised Multi-head Attention-based Masked Sequence Model (MAMSM), analogous to how BERT uses masked language modeling (MLM) and multi-head attention to learn the different meanings of the same word in different sentences. MAMSM masks and encodes tfMRI time series, uses multi-head attention to compute the distinct meanings of identical signal values in an fMRI sequence, and obtains contextual information through MSM pre-training. Furthermore, this work defines a new loss function that extracts FBNs according to the task design information of the tfMRI data. Applied to the Human Connectome Project (HCP) task fMRI dataset, the model achieves state-of-the-art performance in capturing brain temporal dynamics: the Pearson correlation coefficient between the learned features and the task design curves exceeds 0.95, and the model extracts additional meaningful networks beyond the known task-related brain networks.
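Two ingredients of the abstract can be sketched in a few lines: MLM-style masking of an fMRI time series (hiding random time points for the model to reconstruct) and the Pearson-correlation check between a learned temporal feature and a task design curve. The sketch below is illustrative only, assuming NumPy; the function names, the 15% mask ratio, and the toy boxcar design are our assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_sequence(ts, mask_ratio=0.15, mask_value=0.0):
    """MLM-style masking of a 1-D time series: hide a random subset
    of time points; a model would be trained to reconstruct them."""
    ts = np.asarray(ts, dtype=float)
    n_mask = max(1, int(round(mask_ratio * ts.size)))
    idx = rng.choice(ts.size, size=n_mask, replace=False)
    masked = ts.copy()
    masked[idx] = mask_value
    return masked, idx

def pearson(x, y):
    """Pearson correlation between a learned temporal feature and a
    task design curve (the evaluation metric named in the abstract)."""
    x = x - x.mean()
    y = y - y.mean()
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

# Toy example: a boxcar task design (100 time points) and a noisy
# "learned" feature that tracks it.
design = np.tile([0.0] * 10 + [1.0] * 10, 5)
feature = design + 0.05 * rng.standard_normal(design.size)

masked, idx = mask_sequence(feature)
r = pearson(feature, design)
print(round(r, 3))  # close to 1 for this low-noise toy signal
```

For such a clean toy signal the correlation is near 1; the paper reports values above 0.95 for its learned features against real HCP task design curves.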