Metric (data warehouse)
Characterization (materials science)
Mathematics
Divergence (linguistics)
Computer science
Artificial intelligence
Data mining
Physics
Linguistics
Philosophy
Optics
Authors
Jinfeng Lin, S. K. M. Wong
Identifiers
DOI: 10.1080/03081079008935097
Abstract
A new information-theoretic divergence measure is introduced and characterized. This new measure is related to the Kullback directed divergence but does not require the condition of absolute continuity to be satisfied by the probability distributions involved. Moreover, both the lower and upper bounds for the new measure are established in terms of the variational distance. A symmetric form of the divergence can also be defined and described by the Shannon entropy function. Other properties of the new divergences, namely nonnegativity, finiteness, semiboundedness, and boundedness, are also discussed.
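The abstract describes the measure's properties but does not reproduce its formulas. The sketch below is a minimal Python illustration assuming the definitions commonly associated with this divergence: a directed form K(p || q) = sum_i p_i * log(2 * p_i / (p_i + q_i)), its symmetrized sum, and the variational distance that appears in the stated bounds. The function names and example distributions are illustrative, not taken from the paper.

```python
import numpy as np

def directed_divergence(p, q):
    """K(p || q) = sum_i p_i * log2(2 * p_i / (p_i + q_i)).

    The denominator (p_i + q_i) vanishes only where p_i is also zero,
    so, unlike the Kullback directed divergence, no absolute-continuity
    condition on q is required.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute nothing (0 * log 0 = 0 convention)
    return float(np.sum(p[mask] * np.log2(2.0 * p[mask] / (p[mask] + q[mask]))))

def symmetric_divergence(p, q):
    """Symmetric form L(p, q) = K(p || q) + K(q || p).

    Equivalently, in terms of the Shannon entropy H (in bits):
    L(p, q) = 2 * H((p + q) / 2) - H(p) - H(q).
    """
    return directed_divergence(p, q) + directed_divergence(q, p)

def variational_distance(p, q):
    """V(p, q) = sum_i |p_i - q_i|, the distance used in the paper's bounds."""
    return float(np.sum(np.abs(np.asarray(p, dtype=float) - np.asarray(q, dtype=float))))

if __name__ == "__main__":
    p = [0.5, 0.5, 0.0]
    q = [0.1, 0.4, 0.5]
    # Finite even though q assigns mass where p does not (and vice versa).
    print(directed_divergence(p, q))
    print(symmetric_divergence(p, q))
    print(variational_distance(p, q))
```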