Full metadata record
DC Field: Value [Language]
dc.contributor.author: Cheng, SS [en_US]
dc.contributor.author: Wang, HM [en_US]
dc.contributor.author: Fu, HC [en_US]
dc.date.accessioned: 2014-12-08T15:37:10Z
dc.date.available: 2014-12-08T15:37:10Z
dc.date.issued: 2004-12-15 [en_US]
dc.identifier.issn: 1110-8657 [en_US]
dc.identifier.uri: http://dx.doi.org/10.1155/S1110865704407100 [en_US]
dc.identifier.uri: http://hdl.handle.net/11536/25551
dc.description.abstract: We propose a self-splitting Gaussian mixture learning (SGML) algorithm for Gaussian mixture modelling. The SGML algorithm is deterministic and is able to find an appropriate number of components of the Gaussian mixture model (GMM) based on a splitting validity measure, the Bayesian information criterion (BIC). It starts with a single component in the feature space and splits components adaptively during the learning process until the most appropriate number of components is found. The SGML algorithm also performs well in learning a GMM with a given number of components. In our experiments on clustering of a synthetic data set and on a text-independent speaker identification task, we have observed the ability of SGML to perform model-based clustering and to automatically determine the model complexity of the speaker GMMs for speaker identification. [en_US]
dc.language.iso: en_US [en_US]
dc.subject: unsupervised learning [en_US]
dc.subject: Gaussian mixture modelling [en_US]
dc.subject: Bayesian information criterion [en_US]
dc.subject: speaker identification [en_US]
dc.title: A model-selection-based self-splitting Gaussian mixture learning with application to speaker identification [en_US]
dc.type: Article [en_US]
dc.identifier.doi: 10.1155/S1110865704407100 [en_US]
dc.identifier.journal: EURASIP JOURNAL ON APPLIED SIGNAL PROCESSING [en_US]
dc.citation.volume: 2004 [en_US]
dc.citation.issue: 17 [en_US]
dc.citation.spage: 2626 [en_US]
dc.citation.epage: 2639 [en_US]
dc.contributor.department: 資訊工程學系 (Department of Computer Science) [zh_TW]
dc.contributor.department: Department of Computer Science [en_US]
dc.identifier.wosnumber: WOS:000227870700003
dc.citation.woscount: 4
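
For readers who want a concrete feel for the BIC-driven model selection described in the abstract above, the following is a minimal, illustrative Python sketch. It is not the authors' SGML procedure (which splits individual components and refines them during learning); it only mirrors the high-level idea of growing a GMM one component at a time and keeping the growth only while the Bayesian information criterion, BIC = k·ln(n) − 2·ln(L) with k the number of free parameters, n the number of samples, and L the maximized likelihood, keeps improving. The function name grow_gmm_by_bic and all parameter values are illustrative choices, and scikit-learn is assumed to be available.

```python
# Illustrative sketch only: BIC-guided growth of a GMM, not the authors' exact
# SGML algorithm. Start from a single Gaussian and add components while the
# Bayesian information criterion (lower is better) keeps improving.
import numpy as np
from sklearn.mixture import GaussianMixture

def grow_gmm_by_bic(X, max_components=16, random_state=0):
    best_model, best_bic = None, np.inf
    for k in range(1, max_components + 1):
        gmm = GaussianMixture(n_components=k, covariance_type="full",
                              random_state=random_state).fit(X)
        bic = gmm.bic(X)   # trades off data fit against model complexity
        if bic < best_bic:
            best_model, best_bic = gmm, bic
        else:
            break          # stop growing once BIC stops improving
    return best_model

# Usage example on synthetic data drawn from three well-separated Gaussians.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=m, scale=0.5, size=(200, 2)) for m in (-4.0, 0.0, 4.0)])
model = grow_gmm_by_bic(X)
print("selected number of components:", model.n_components)
```
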
Appears in Collections: Articles


Files in This Item:

  1. 000227870700003.pdf
