Pruning
Computer science
Recall
Working memory
Mechanism (biology)
Cognitive psychology
Lossless compression
Artificial intelligence
Machine learning
Psychology
Cognition
Neuroscience
Biology
Data compression
Epistemology
Philosophy
Agronomy
Authors
Peter Shepherdson,Klaus Oberauer
Abstract
Substantial behavioral evidence suggests that attention plays an important role in working memory. Frequently, attention is characterized as enhancing representations by increasing their strength or activation level. Despite the intuitive appeal of this idea, using attention to strengthen representations in computational models can lead to unexpected outcomes. Representational strengthening frequently leads to worse, rather than better, performance, contradicting behavioral results. Here, we propose an alternative to a pure strengthening account, in which attention is used to selectively strengthen useful and weaken less useful components of distributed memory representations, thereby pruning the representations. We use a simple sampling algorithm to implement this pruning mechanism in a computational model of working memory. Our simulations show that pruning representations in this manner leads to improvements in performance compared with a lossless (i.e., decay-free) baseline condition, for both discrete recall (e.g., of a list of words) and continuous reproduction (e.g., of an array of colors). Pruning also offers a potential explanation of why a retro-cue drawing attention to one memory item during the retention interval improves performance. These results indicate that a pruning mechanism could provide a viable alternative to pure strengthening accounts of attention to representations in working memory.
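The abstract describes attention as pruning a distributed memory trace: useful components are selectively strengthened and less useful ones weakened via a simple sampling procedure. The snippet below is a minimal illustrative sketch of that general idea, not the authors' actual model; the function name, the sampling budget `n_samples`, and the way the trace is rebuilt from sample counts are all assumptions made for demonstration.

```python
import numpy as np

def prune_representation(weights, n_samples=50, rng=None):
    """Illustrative sketch (assumed, not the published algorithm):
    resample a fixed budget of units in proportion to each component's
    magnitude, then rebuild the trace from the sample counts, so that
    frequently sampled components are strengthened and components that
    are never sampled are pruned to zero."""
    rng = np.random.default_rng() if rng is None else rng
    total = np.abs(weights).sum()
    probs = np.abs(weights) / total          # sampling probability per component
    counts = rng.multinomial(n_samples, probs)
    # Rebuild the trace: new strength reflects sampling frequency,
    # preserving original signs and the overall scale of the trace.
    return np.sign(weights) * counts / n_samples * total

# Example: prune a noisy distributed trace of a single memory item.
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, size=20)
print(prune_representation(trace, n_samples=30, rng=rng))
```

With a small sampling budget, weak components tend to drop out entirely while strong components are retained or amplified, which is the qualitative pattern the abstract attributes to attention-based pruning.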