Bayesian networks
Conditional entropy
Entropy (arrow of time)
Upper and lower bounds
Maximum entropy principle
Mathematics
Conditional probability
Joint entropy
Computer science
Transfer entropy
Conditional probability distribution
Artificial intelligence
Machine learning
Algorithm
Statistics
Physics
Mathematical analysis
Quantum mechanics
Authors
Mathieu Serrurier, Henri Prade
Source
Journal: Advances in Intelligent Systems and Computing
Date: 2015-01-01
Pages: 87-95
Identifier
DOI: 10.1007/978-3-319-10765-3_11
Abstract
The most common way to learn the structure of a Bayesian network is to use a score function together with an optimization process. When no prior knowledge about the structure is available, score functions based on information theory are used to balance the entropy of the conditional probability tables against network complexity. This complexity has a strong impact on the uncertainty of the estimated conditional distributions. However, the complexity term is computed independently of the entropy and thus does not faithfully reflect the estimation uncertainty. In this paper we propose a new entropy function, the "possibilistic upper entropy", which relies on the entropy of a possibility distribution that encodes an upper bound on the estimated frequencies. Since the network structure directly affects the number of data points available for probability estimation, the possibilistic upper entropy is particularly well suited to learning the network structure. We also show that the possibilistic upper entropy yields an incremental algorithm for the online learning of Bayesian networks.
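For context, the classical scores the abstract alludes to combine an empirical conditional-entropy term with a complexity penalty. Below is a minimal sketch of such an MDL/BIC-style score for a single node, assuming discrete data given as dictionaries; the function name and interface are hypothetical, and the paper's contribution (replacing the entropy term with a possibilistic upper entropy) is not implemented here.

```python
import math
from collections import Counter

def mdl_node_score(data, child, parents, arities):
    """MDL/BIC-style local score for one node of a Bayesian network:
    N * H(child | parents) + (log2 N / 2) * (number of free CPT parameters).
    `data` is a list of dicts mapping variable name -> discrete value;
    `arities` maps each variable to its number of possible values.
    Lower scores are better. (Illustration only; the paper replaces the
    entropy term with a "possibilistic upper entropy".)
    """
    n = len(data)
    joint = Counter()          # counts of (parent configuration, child value)
    parent_counts = Counter()  # counts of each parent configuration
    for row in data:
        cfg = tuple(row[p] for p in parents)
        joint[(cfg, row[child])] += 1
        parent_counts[cfg] += 1

    # Empirical conditional entropy H(child | parents), in bits
    h = 0.0
    for (cfg, _), c in joint.items():
        p_joint = c / n
        p_cond = c / parent_counts[cfg]
        h -= p_joint * math.log2(p_cond)

    # Complexity: free parameters in the conditional probability table
    q = 1
    for p in parents:
        q *= arities[p]
    k = q * (arities[child] - 1)
    return n * h + 0.5 * math.log2(n) * k
```

On a toy dataset where Y deterministically copies X, the score correctly prefers adding X as a parent of Y: the entropy term drops to zero while the penalty grows only logarithmically with the sample size. The abstract's point is that this penalty is computed independently of the entropy, whereas the possibilistic upper entropy folds the estimation uncertainty directly into the entropy term.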