Computer science
Mechanism (biology)
Artificial intelligence
Brain-computer interface
Topic (document)
Cognitive science
Human-computer interaction
Machine learning
Neuroscience
World Wide Web
Electroencephalography
Psychology
Philosophy
Epistemology
Authors
Aigerim Keutayeva, Berdakh Abibullaev
Identifier
DOI:10.1007/978-3-031-53827-8_23
Abstract
This research examines attention-mechanism-driven deep learning models for building subject-independent Brain-Computer Interfaces (BCIs). Three different attention models were evaluated using Leave-One-Subject-Out (LOSO) cross-validation. The results showed that the Hybrid Temporal CNN and ViT model performed well on the BCI Competition IV 2a dataset, achieving the highest average accuracy and outperforming the other models for 5 of 9 subjects. However, this model was not the best performer on the BCI Competition IV 2b dataset. One challenge was the limited size of the data, especially for transformer models, which require large amounts of training data; this contributed to the performance variability between datasets. This study highlights a beneficial approach to designing BCIs: combining attention mechanisms with deep learning to extract important inter-subject features from EEG data while filtering out irrelevant signals.
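The Leave-One-Subject-Out protocol mentioned in the abstract can be sketched with scikit-learn's `LeaveOneGroupOut` splitter, where each "group" is one subject. This is a minimal illustration with random toy data, not the authors' code; the subject counts and array shapes are placeholders.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut

# Toy stand-in for EEG features: 9 subjects, a few trials each.
rng = np.random.default_rng(0)
n_subjects, trials_per_subject, n_features = 9, 4, 8
X = rng.normal(size=(n_subjects * trials_per_subject, n_features))
y = rng.integers(0, 2, size=len(X))  # e.g. binary motor-imagery labels
subjects = np.repeat(np.arange(n_subjects), trials_per_subject)

# One fold per subject: that subject forms the test set, all others train.
logo = LeaveOneGroupOut()
folds = list(logo.split(X, y, groups=subjects))

for train_idx, test_idx in folds:
    held_out = np.unique(subjects[test_idx])
    # Exactly one subject is held out, and it never appears in training.
    assert len(held_out) == 1
    assert held_out[0] not in subjects[train_idx]
```

Averaging a model's test accuracy over these folds gives the subject-independent performance estimate the study reports.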