Computer science
Gaussian distribution
Mixture model
Pattern recognition (psychology)
Training set
Gaussian process
Artificial intelligence
Set (abstract data type)
Scaling
Estimation theory
Kullback-Leibler divergence
Maximum likelihood
Algorithm
Statistics
Mathematics
Physics
Geometry
Quantum mechanics
Programming language
Authors
Jan Vaněk, Lukáš Machlica, Josef Psutka
Source
Journal: Springer eBooks
[Springer Nature]
Date: 2013-11-20
Pages: 49-56
Cited by: 3
Identifier
DOI: 10.1007/978-3-642-41822-8_7
Abstract
Single-Gaussian and Gaussian-Mixture Models are utilized in various pattern recognition tasks. The model parameters are usually estimated via Maximum Likelihood Estimation (MLE) with respect to the available training data. However, if only a small amount of training data is available, the resulting model will not generalize well; loosely speaking, classification performance on an unseen test set may be poor. In this paper, we propose a novel estimation technique for the model variances. Once the variances have been estimated using MLE, they are multiplied by a scaling factor that reflects the amount of uncertainty present in the limited sample set. The optimal value of the scaling factor is based on the Kullback-Leibler criterion and on the assumption that the training and test sets are sampled from the same source distribution. In addition, in the case of GMMs, the proper number of components can be determined.
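The core idea of the abstract — fit variances by MLE, then inflate them by a scaling factor to compensate for small-sample uncertainty — can be sketched as follows for the single-Gaussian case. Note that the function name `scaled_gaussian_fit` and the example value of `alpha` are illustrative assumptions; the paper's actual optimal scaling factor, derived from the Kullback-Leibler criterion, is not reproduced here.

```python
import numpy as np

def scaled_gaussian_fit(x, alpha):
    """Fit a single Gaussian by MLE, then inflate the variance.

    alpha (> 1) is a scaling factor reflecting the uncertainty of a
    limited sample set. In the paper its optimal value follows from a
    KL-based criterion; here it is left as a free parameter.
    """
    mu = x.mean()
    var_mle = x.var()  # biased MLE variance (divides by n)
    return mu, alpha * var_mle

rng = np.random.default_rng(0)
sample = rng.normal(loc=0.0, scale=1.0, size=10)  # small training set
mu, var = scaled_gaussian_fit(sample, alpha=1.25)  # alpha is a placeholder
```

With `alpha = 1` this reduces to the plain MLE fit; larger values widen the density, trading a worse fit on the training sample for better coverage of unseen test data drawn from the same source distribution.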