Hebbian theory
Competitive learning
Connectionism
Computer science
Artificial intelligence
Unsupervised learning
Invariant (physics)
Set (abstract data type)
Optics (focus)
Feature learning
Artificial neural network
Machine learning
Pattern recognition (psychology)
Mathematics
Generalization error
Wake-sleep algorithm
Programming language
Physics
Optics
Mathematical physics
Authors
Nicol N. Schraudolph, Terrence J. Sejnowski
Source
Journal: Neural Information Processing Systems
Date: 1991-12-02
Volume/Issue: 4: 1017-1024
Citations: 10
Abstract
Although the detection of invariant structure in a given set of input patterns is vital to many recognition tasks, connectionist learning rules tend to focus on directions of high variance (principal components). The prediction paradigm is often used to reconcile this dichotomy; here we suggest a more direct approach to invariant learning based on an anti-Hebbian learning rule. An unsupervised two-layer network implementing this method in a competitive setting learns to extract coherent depth information from random-dot stereograms.
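The anti-Hebbian idea in the abstract can be illustrated with a minimal sketch: flipping the sign of a Hebbian update makes a linear unit suppress, rather than amplify, directions of high variance, so with weight normalization it settles on the low-variance (invariant) direction. This is a toy construction under assumed details (a single linear unit, explicit renormalization, synthetic Gaussian data), not the paper's two-layer competitive network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D data: high variance along (1, 1), low variance along (1, -1).
# The low-variance direction plays the role of the "invariant" here.
basis = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
X = (rng.normal(size=(5000, 2)) * np.array([3.0, 0.3])) @ basis

w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 0.01  # learning rate (assumed; not from the paper)

for x in X:
    y = w @ x                  # linear unit's output
    w -= eta * y * x           # anti-Hebbian update: sign-flipped Hebb rule
    w /= np.linalg.norm(w)     # keep the weight vector on the unit sphere

# The weight vector converges toward the minor (low-variance) component,
# i.e. the invariant direction (1, -1)/sqrt(2).
minor = basis[1]
print(abs(w @ minor))  # should be close to 1.0
```

Contrast this with the Hebbian/Oja update `w += eta * y * x`, which under the same normalization extracts the principal (high-variance) component instead; the sign flip is what redirects the unit toward invariant structure.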