Connectionism
Computer science
Set (abstract data type)
Task (project management)
Representation (politics)
Context (archaeology)
Artificial intelligence
Security token
Natural language processing
Semantic memory
Theoretical computer science
Cognition
Artificial neural network
Psychology
Paleontology
Computer security
Management
Neuroscience
Politics
Political science
Law
Economics
Biology
Programming language
Identifier
DOI:10.1016/0364-0213(90)90002-e
Abstract
Time underlies many interesting human behaviors. Thus, the question of how to represent time in connectionist models is very important. One approach is to represent time implicitly by its effects on processing rather than explicitly (as in a spatial representation). The current report develops a proposal along these lines first described by Jordan (1986) which involves the use of recurrent links in order to provide networks with a dynamic memory. In this approach, hidden unit patterns are fed back to themselves: the internal representations which develop thus reflect task demands in the context of prior internal states. A set of simulations is reported which range from relatively simple problems (temporal version of XOR) to discovering syntactic/semantic features for words. The networks are able to learn interesting internal representations which incorporate task demands with memory demands: indeed, in this approach the notion of memory is inextricably bound up with task processing. These representations reveal a rich structure, which allows them to be highly context-dependent, while also expressing generalizations across classes of items. These representations suggest a method for representing lexical categories and the type/token distinction.
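The abstract describes what is now called a simple recurrent (Elman) network: the hidden activations from the previous time step are copied into context units and fed back as extra input, so memory emerges from task processing rather than from an explicit spatial representation of time. The sketch below, in Python/NumPy, trains such a network on the temporal XOR task the abstract mentions. The layer sizes, learning rate, data generator, and one-step gradient are illustrative assumptions, not a reconstruction of the paper's exact setup.

```python
# Minimal sketch of a simple recurrent (Elman) network on temporal XOR.
# Every third bit of the input stream is the XOR of the two before it;
# the network's task is to predict the next bit at each time step.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def make_xor_stream(n_triples):
    """Bit stream in which every third bit is the XOR of the two before it."""
    bits = []
    for _ in range(n_triples):
        a, b = rng.integers(0, 2, size=2)
        bits.extend([a, b, a ^ b])
    return np.array(bits, dtype=float)

n_hidden = 8                                        # illustrative size
w_xh = rng.normal(0.0, 0.5, n_hidden)               # input -> hidden
W_hh = rng.normal(0.0, 0.5, (n_hidden, n_hidden))   # context -> hidden
b_h = np.zeros(n_hidden)
w_hy = rng.normal(0.0, 0.5, n_hidden)               # hidden -> output
b_y = 0.0
lr = 0.1

stream = make_xor_stream(3000)
h_prev = np.zeros(n_hidden)                         # context units start at rest

for t in range(len(stream) - 1):
    x, target = stream[t], stream[t + 1]            # task: predict the next bit
    # Forward pass: the hidden state mixes the current input with the
    # previous hidden pattern fed back through the context units.
    h = sigmoid(w_xh * x + W_hh @ h_prev + b_h)
    y = sigmoid(w_hy @ h + b_y)
    # One-step gradient: h_prev is treated as a constant input, so no
    # backpropagation through time is needed (as in Elman's training scheme).
    dy = (y - target) * y * (1.0 - y)
    dh = dy * w_hy * h * (1.0 - h)
    w_hy -= lr * dy * h
    b_y -= lr * dy
    w_xh -= lr * dh * x
    W_hh -= lr * np.outer(dh, h_prev)
    b_h -= lr * dh
    h_prev = h                                      # copy hidden pattern into context

# Evaluation: Elman reports that prediction error falls on the predictable
# (XOR-determined) bits and stays near chance on the random ones.
errs = {0: [], 1: [], 2: []}
h_prev = np.zeros(n_hidden)
test = make_xor_stream(500)
for t in range(len(test) - 1):
    h = sigmoid(w_xh * test[t] + W_hh @ h_prev + b_h)
    y = sigmoid(w_hy @ h + b_y)
    errs[(t + 1) % 3].append(abs(y - test[t + 1]))
    h_prev = h
for pos in range(3):
    print(f"bit position {pos}: mean abs error {np.mean(errs[pos]):.3f}")
```

The one-step gradient, which backpropagates error only through the current time step while treating the fed-back context as fixed input, follows the training scheme Elman used; a modern implementation would more likely unroll the network and use backpropagation through time.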