Cross-dataset transfer learning for motor imagery signal classification via multi-task learning and pre-training

Computer science, Transfer learning, Artificial intelligence, Machine learning, Robustness (evolution), Motor imagery, Task (project management), Multi-task learning, Pattern recognition (psychology), Electroencephalography, Brain–computer interface, Psychology, Biochemistry, Chemistry, Management, Psychiatry, Economics, Gene
Authors
Yuting Xie,Kun Wang,Jiayuan Meng,Yue Jin,Lin Meng,Weibo Yi,Tzyy‐Ping Jung,Minpeng Xu,Ming Dong
Source
Journal: Journal of Neural Engineering [IOP Publishing]
Volume/Issue: 20 (5): 056037. Cited by: 23
Identifier
DOI:10.1088/1741-2552/acfe9c
Abstract

Objective. Deep learning (DL) models have proven effective at decoding motor imagery (MI) signals from electroencephalogram (EEG) data. However, their success relies heavily on large amounts of training data, whereas EEG data collection is laborious and time-consuming. Cross-dataset transfer learning has recently emerged as a promising way to meet the data requirements of DL models. Nevertheless, transferring knowledge across datasets that involve different MI tasks remains a significant challenge, limiting the full utilization of valuable data resources.
Approach. This study proposes a pre-training-based cross-dataset transfer learning method inspired by hard parameter sharing in multi-task learning. Datasets with distinct MI paradigms are treated as different tasks and classified with shared feature-extraction layers plus individual task-specific layers, allowing cross-dataset classification with one unified model. Pre-training and fine-tuning are then employed to transfer knowledge across datasets. We also designed four fine-tuning schemes and conducted extensive experiments on them.
Main results. Compared with models without pre-training, pre-trained models achieved up to 7.76% higher accuracy. When only limited training data were available, pre-training improved the DL model's accuracy by as much as 27.34%. The experiments also revealed that pre-trained models converge faster and are remarkably robust: training time per subject was reduced by up to 102.83 s, and the variance of classification accuracy decreased by up to 75.22%.
Significance. This study represents the first comprehensive investigation of cross-dataset transfer learning between two datasets with different MI tasks. The proposed pre-training method requires only minimal fine-tuning data when applying DL models to new MI paradigms, making MI brain–computer interfaces more practical and user-friendly.
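The abstract's core mechanism, hard parameter sharing, can be sketched as one shared feature extractor feeding dataset-specific classification heads. The sketch below is a minimal illustrative assumption, not the authors' actual architecture: layer sizes, the epoch shape, the class counts per dataset, and the use of a single linear layer are all placeholders chosen to show the parameter-sharing structure only.

```python
# Minimal sketch of hard parameter sharing for cross-dataset MI
# classification: one shared feature extractor, one head per dataset.
# All dimensions and names below are hypothetical examples.
import numpy as np

rng = np.random.default_rng(0)

n_channels, n_samples = 22, 256   # assumed EEG epoch shape (channels x time)
n_features = 64                   # assumed shared-representation size

# Shared feature-extraction weights: learned during pre-training and
# reused (or lightly fine-tuned) for every dataset.
W_shared = rng.standard_normal((n_channels * n_samples, n_features)) * 0.01

# Task-specific heads: e.g. dataset A with 4 MI classes, dataset B with 2.
heads = {
    "dataset_A": rng.standard_normal((n_features, 4)) * 0.01,
    "dataset_B": rng.standard_normal((n_features, 2)) * 0.01,
}

def forward(epoch, task):
    """Flatten an EEG epoch, map it through the shared extractor,
    then through the head belonging to `task`."""
    h = np.tanh(epoch.reshape(-1) @ W_shared)  # shared representation
    return h @ heads[task]                     # task-specific logits

epoch = rng.standard_normal((n_channels, n_samples))
print(forward(epoch, "dataset_A").shape)  # one logit per class in dataset A
print(forward(epoch, "dataset_B").shape)  # one logit per class in dataset B
```

In this scheme, pre-training fits `W_shared` (and one head) on the source dataset; fine-tuning on a new dataset then only needs enough data to fit its small task-specific head, which is why the method tolerates limited target-dataset data.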