Neurotypical
Psychology
Autism
Multisensory integration
Cognitive psychology
Gesture
Audiology
Autism spectrum disorder
Developmental psychology
Computer science
Medicine
Sensory system
Computer vision
Authors
Magdalena Matyjek, Sotaro Kita, Mireia Torralba Cuello, Salvador Soto‐Faraco
Abstract
Seeing the speaker often facilitates auditory speech comprehension through audio‐visual integration. This audio‐visual facilitation is stronger under challenging listening conditions, such as in real‐life social environments. Autism has been associated with atypicalities in integrating audio‐visual information, potentially underlying social difficulties in this population. The present study investigated multisensory integration (MSI) of audio‐visual speech information among autistic and neurotypical adults. Participants performed a speech‐in‐noise task in a realistic multispeaker social scenario with audio‐visual, auditory, or visual trials while their brain activity was recorded using EEG. The neurotypical group demonstrated a non‐linear audio‐visual effect in alpha oscillations, whereas the autistic group showed merely additive processing. Despite these differences in neural correlates, both groups achieved similar behavioral audio‐visual facilitation outcomes. These findings suggest that although autistic and neurotypical brains might process multisensory cues differently, they achieve comparable benefits from audio‐visual speech. These results contribute to the growing body of literature on MSI atypicalities in autism.