Information theory
Observer (physics)
Detection theory
Gaussian distribution
Upper and lower bounds
Computer science
Probability distribution
PsycINFO
Mathematical theory
Theoretical computer science
Psychophysics
Stimulus (psychology)
Artificial intelligence
Algorithm
Mathematics
Statistics
Perception
Cognitive psychology
Psychology
Quantum mechanics
Mathematical analysis
Telecommunications
Neuroscience
Law
Political science
MEDLINE
Physics
Detector
Source
Journal: Psychological Review
[American Psychological Association]
Date: 2021-10-01
Volume/issue: 128 (5): 976-987
Cited by: 6
Abstract
Signal detection theory (SDT), the standard mathematical framework by which we understand how stimuli are classified into distributions such as signal or noise, is an essential part of the modern psychologist's toolkit. This article introduces some mathematical tools derived from information theory which allow surprisingly simple approximations to key quantities in SDT. The main idea is a lower bound on the probability of correct classification of a stimulus, as a function of information-theoretic properties of the generating distribution. This bound depends on three distinct factors, each of which can be quantified information-theoretically: (a) The prior uncertainty in the choice of generating distribution; (b) the inherent separability of the classes; and (c) the discrepancy between the observer's model of the class distributions and the "true" model. The bound is only a loose substitute for the conventional method for computing proportion correct (via integration) but generalizes readily to multiple dimensions and larger numbers of stimulus categories, where direct integration is computationally difficult. Moreover, unlike most conventional SDT formulae, this bound does not require Gaussian distributions. Most importantly, the information-theoretic signal detection theory (IT-SDT) framework sheds light on the way classification performance depends on the discrepancy between the observer's assumptions and those actually governing the environment. (PsycInfo Database Record (c) 2021 APA, all rights reserved).
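The "conventional method for computing proportion correct (via integration)" that the abstract contrasts with the IT-SDT bound can be illustrated with a minimal sketch. The sketch below assumes the textbook equal-prior, equal-variance Gaussian case (noise ~ N(0,1), signal ~ N(d',1)); the function names and the specific setup are illustrative, not taken from the paper. An ideal observer's accuracy is the integral of the larger of the two prior-weighted class densities, which in this special case has the closed form Pc = Phi(d'/2):

```python
import math

def norm_pdf(x, mu, sigma=1.0):
    """Density of N(mu, sigma^2) at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def proportion_correct(d_prime, lo=-10.0, hi=12.0, n=200_000):
    """Ideal-observer accuracy for equal-prior classes N(0,1) vs N(d',1),
    computed by direct numerical integration (midpoint rule) of
    max(0.5 * p_noise(x), 0.5 * p_signal(x)) over x."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        total += 0.5 * max(norm_pdf(x, 0.0), norm_pdf(x, d_prime)) * dx
    return total

d = 1.0
pc_integrated = proportion_correct(d)
pc_closed_form = norm_cdf(d / 2.0)  # classical SDT result: Pc = Phi(d'/2)
```

For d' = 1 both routes give Pc of roughly 0.69. The point the abstract makes is that this direct integration becomes computationally difficult in multiple dimensions or with many stimulus categories, which is where a looser but easily computed information-theoretic bound earns its keep.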