
Publication Detail

Citation

Jan Vaněk, Lukáš Machlica and Josef Psutka: Estimation of Single-Gaussian and Gaussian Mixture Models for Pattern Recognition. 18th Iberoamerican Congress on Pattern Recognition, Lecture Notes in Computer Science, Springer, 2013.

Abstract

Single-Gaussian and Gaussian-Mixture Models are utilized in various pattern recognition tasks. The model parameters are usually estimated via Maximum Likelihood Estimation (MLE) with respect to available training data. However, if only a small amount of training data is available, the resulting model will not generalize well. Loosely speaking, classification performance on an unseen test set may be poor. In this paper, we propose a novel estimation technique for the model variances. Once the variances have been estimated using MLE, they are multiplied by a scaling factor, which reflects the amount of uncertainty present in the limited sample set. The optimal value of the scaling factor is based on the Kullback-Leibler criterion and on the assumption that the training and test sets are sampled from the same source distribution. In addition, in the case of GMM, the proper number of components can be determined.
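
The estimation scheme described in the abstract, fitting a Gaussian by MLE and then inflating the ML variance by a factor that reflects the uncertainty of a small sample, can be sketched roughly as follows. This is a minimal sketch, not the authors' code: the KL-based optimal factor derived in the paper is not reproduced here, and alpha_fn, mle_gaussian and the other names below are hypothetical placeholders.

import numpy as np

def mle_gaussian(samples):
    """ML estimates of a single diagonal Gaussian (variance uses 1/N, not 1/(N-1))."""
    mu = np.mean(samples, axis=0)
    var = np.mean((samples - mu) ** 2, axis=0)
    return mu, var

def scaled_variance(var_mle, n_samples, alpha_fn):
    """Multiply the ML variance by a scaling factor reflecting sample-size uncertainty.

    alpha_fn stands in for the paper's KL-optimal factor; here it is a
    user-supplied placeholder depending only on the number of training samples.
    """
    return alpha_fn(n_samples) * var_mle

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(loc=1.0, scale=2.0, size=(20, 1))   # small training set
    mu, var = mle_gaussian(data)
    # Illustrative factor only: inflates the variance for small N and tends to 1 as N grows.
    var_robust = scaled_variance(var, len(data), alpha_fn=lambda n: (n + 1) / n)
    print(mu, var, var_robust)

In the GMM case the abstract indicates that the same idea is applied per component, and that the criterion also provides a way to choose a suitable number of components.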

Publication Detail

Title: Estimation of Single-Gaussian and Gaussian Mixture Models for Pattern Recognition
Authors: Jan Vaněk; Lukáš Machlica; Josef Psutka
Title (Czech): Estimation of Single-Gaussian and Gaussian Mixture Models for Pattern Recognition
Publication language: English
Date of publication: 21.12.2013
Year of publication: 2013
Publication type: Conference paper
Book title: 18th Iberoamerican Congress on Pattern Recognition
Series: Lecture Notes in Computer Science
Publisher: Springer

Keywords

Maximum Likelihood Estimation, Gaussian Mixture Model, Kullback-Leibler Divergence, Variance, Scaling

BibTeX

@INPROCEEDINGS{JanVanek_2013_Estimationof,
 author = {Jan Van\v{e}k and Luk\'{a}\v{s} Machlica and Josef Psutka},
 title = {Estimation of Single-Gaussian and Gaussian Mixture Models for Pattern Recognition},
 year = {2013},
 publisher = {Springer},
 booktitle = {18th Iberoamerican Congress on Pattern Recognition},
 series = {Lecture Notes in Computer Science},
 url = {http://www.kky.zcu.cz/en/publications/JanVanek_2013_Estimationof},
}