Please use this identifier to cite or link to this item:
https://scholarhub.balamand.edu.lb/handle/uob/1674
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Blouet, Raphael | en_US |
dc.contributor.author | Mokbel, Chafic | en_US |
dc.contributor.author | Chollet, Gérard | en_US |
dc.date.accessioned | 2020-12-23T08:57:13Z | - |
dc.date.available | 2020-12-23T08:57:13Z | - |
dc.date.issued | 2004 | - |
dc.identifier.uri | https://scholarhub.balamand.edu.lb/handle/uob/1674 | - |
dc.description.abstract | This article presents BECARS (Balamand-ENST-CEDRE Automatic Recognition of Speakers), free software for training Gaussian Mixture Models (GMM). BECARS supports several classical adaptation techniques (such as MAP) and proposes original ones (namely MAP_TREE and MAP_TREE_SPEC). In this paper, each technique is described in detail and evaluated on the data of the NIST 2003 Speaker Verification Evaluation campaign [Przybocki, 2003]. We introduce this work with a review of the fundamentals of Automatic Speaker Verification (ASV). We then present the main characteristics of Gaussian Mixture Models (GMM), which are the most common tool for speaker modeling in ASV systems. Next, we describe each adaptation technique available in BECARS. Finally, we evaluate the performance of each technique before concluding the paper. | en_US |
dc.format.extent | 4 p. | en_US |
dc.language.iso | fre | en_US |
dc.title | BECARS : un logiciel libre pour la vérification du locuteur | en_US |
dc.type | Journal Article | en_US |
dc.contributor.affiliation | Department of Electrical Engineering | en_US |
dc.description.startpage | 1 | en_US |
dc.description.endpage | 4 | en_US |
dc.date.catalogued | 2019-07-02 | - |
dc.description.status | Published | en_US |
dc.identifier.OlibID | 192613 | - |
dc.relation.ispartoftext | Journal of environmental protection | en_US |
dc.provenance.recordsource | Olib | en_US |
Appears in Collections: | Department of Electrical Engineering |
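
The abstract above refers to GMM training and MAP adaptation for speaker verification. As a point of reference only, here is a minimal sketch of classical mean-only MAP adaptation of a GMM universal background model (UBM), written in Python/NumPy. It is not the BECARS implementation (nor its MAP_TREE variants); the function name, the relevance factor value, and the diagonal-covariance assumption are illustrative choices.

```python
# Minimal sketch of mean-only MAP adaptation of a diagonal-covariance GMM (UBM).
# Illustrative only; not taken from BECARS.
import numpy as np

def map_adapt_means(weights, means, covars, features, relevance=16.0):
    """MAP-adapt the UBM means toward a speaker's feature vectors.

    weights : (M,)    mixture weights of the UBM
    means   : (M, D)  component means
    covars  : (M, D)  diagonal covariances
    features: (T, D)  speaker feature vectors (e.g. MFCC frames)
    """
    # Per-frame, per-component Gaussian log-likelihoods
    diff = features[:, None, :] - means[None, :, :]              # (T, M, D)
    log_gauss = -0.5 * (np.sum(diff**2 / covars, axis=2)
                        + np.sum(np.log(2 * np.pi * covars), axis=1))
    # Posterior responsibilities of each component for each frame
    log_post = np.log(weights) + log_gauss                        # (T, M)
    log_post -= np.logaddexp.reduce(log_post, axis=1, keepdims=True)
    post = np.exp(log_post)

    # Zero- and first-order sufficient statistics
    n_m = post.sum(axis=0)                                        # (M,)
    ex_m = post.T @ features / np.maximum(n_m[:, None], 1e-10)    # (M, D)

    # MAP interpolation between UBM means and data means
    alpha = n_m / (n_m + relevance)                               # (M,)
    return alpha[:, None] * ex_m + (1.0 - alpha[:, None]) * means
```

In a GMM-UBM verification system of this kind, the test score is typically the average log-likelihood ratio between the speaker's adapted model and the UBM, compared against a decision threshold.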