Kullback-Leibler divergence
Mathematics
Divergence (linguistics)
Entropy (arrow of time)
Information theory
Rényi entropy
Probability distribution
Distance measure
Probability of error
Shannon's source coding theorem
Applied mathematics
Discrete mathematics
Principle of maximum entropy
Combinatorics
Statistics
Computer science
Algorithm
Artificial intelligence
Binary entropy function
Maximum entropy thermodynamics
Quantum mechanics
Physics
Philosophy
Linguistics
Abstract
A novel class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known Kullback divergences, the new measures do not require the condition of absolute continuity to be satisfied by the probability distributions involved. More importantly, their close relationship with the variational distance and the probability of misclassification error is established in terms of bounds. These bounds are crucial in many applications of divergence measures. The measures are also well characterized by the properties of nonnegativity, finiteness, semiboundedness, and boundedness.
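The abstract's central point is that Kullback-type divergences blow up when one distribution assigns zero probability where the other does not (a failure of absolute continuity), whereas the new entropy-based measures remain finite and bounded. Below is a minimal Python sketch of that contrast, assuming the Jensen-Shannon divergence as a representative member of this class (the abstract itself does not name the measures) and using made-up example distributions:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.

    Infinite whenever q assigns zero mass to an outcome that p does not,
    i.e. when absolute continuity of p with respect to q fails.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    if np.any(q[mask] == 0):
        return np.inf  # absolute continuity violated
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def js_divergence(p, q):
    """Jensen-Shannon divergence: a Shannon-entropy-based measure that is
    nonnegative, finite, and bounded (by 1 bit with log base 2), and is
    defined with no absolute-continuity requirement, since the mixture m
    is positive wherever either p or q is."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Hypothetical distributions: q puts zero mass where p does not.
p = [0.5, 0.5, 0.0]
q = [0.9, 0.0, 0.1]
print(kl_divergence(p, q))  # inf -- Kullback divergence undefined/infinite
print(js_divergence(p, q))  # finite, <= 1 bit
```

With these inputs the KL divergence is infinite, while the Jensen-Shannon value is roughly 0.34 bits, within the 1-bit bound; this is the finiteness and boundedness behavior the abstract claims, alongside the bounds it establishes against the variational distance and the misclassification error probability.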