Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chen, PN | en_US |
dc.contributor.author | Alajaji, F | en_US |
dc.date.accessioned | 2014-12-08T15:49:07Z | - |
dc.date.available | 2014-12-08T15:49:07Z | - |
dc.date.issued | 1998-05-01 | en_US |
dc.identifier.issn | 0253-3839 | en_US |
dc.identifier.uri | http://hdl.handle.net/11536/32649 | - |
dc.description.abstract | Expressions for the epsilon-entropy rate, epsilon-mutual information rate and epsilon-divergence rate are introduced. These quantities, which consist of the quantiles of the asymptotic information spectra, generalize the inf/sup-entropy/information/divergence rates of Han and Verdú. The algebraic properties of these information measures are rigorously analyzed, and examples illustrating their use in the computation of the epsilon-capacity are presented. In Part II of this work, these measures are employed to prove general source coding theorems for block codes, and the general formula of the Neyman-Pearson hypothesis testing type-II error exponent subject to upper bounds on the type-I error probability. | en_US |
dc.language.iso | en_US | en_US |
dc.subject | information theory | en_US |
dc.subject | entropy | en_US |
dc.subject | mutual information | en_US |
dc.subject | divergence | en_US |
dc.subject | e-capacity | en_US |
dc.title | Generalized source coding theorems and hypothesis testing: Part I - Information measures | en_US |
dc.type | Article | en_US |
dc.identifier.journal | JOURNAL OF THE CHINESE INSTITUTE OF ENGINEERS | en_US |
dc.citation.volume | 21 | en_US |
dc.citation.issue | 3 | en_US |
dc.citation.spage | 283 | en_US |
dc.citation.epage | 292 | en_US |
dc.contributor.department | 電信工程研究所 | zh_TW |
dc.contributor.department | Institute of Communications Engineering | en_US |
dc.identifier.wosnumber | WOS:000074038300004 | - |
dc.citation.woscount | 2 | - |
Appears in Collections: | Journal Articles |
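The abstract defines the epsilon-entropy rate as a quantile of the asymptotic entropy spectrum, i.e. of the distribution of the normalized self-information (1/n) log 1/P(X^n). A minimal numerical sketch of that idea, assuming an i.i.d. Bernoulli source (all parameter values here are illustrative, not from the paper): for such a source the spectrum concentrates around the Shannon entropy H, so the empirical epsilon-quantile lands close to H.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative i.i.d. Bernoulli(p) source; parameters are arbitrary choices.
p = 0.3        # P(X = 1)
n = 2000       # block length
trials = 5000  # number of sample blocks

# Draw `trials` blocks of length n and count the ones in each block.
x = rng.random((trials, n)) < p
ones = x.sum(axis=1)

# Normalized self-information (1/n) * log2(1 / P(x^n)) of each block, in bits.
# Its empirical distribution approximates the entropy spectrum.
h_spec = -(ones * np.log2(p) + (n - ones) * np.log2(1.0 - p)) / n

# Empirical eps-quantile of the spectrum: an estimate of the eps-entropy rate.
eps = 0.1
h_eps = np.quantile(h_spec, eps)

# Binary entropy H(p); for an i.i.d. source the spectrum concentrates here.
H = -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))
print(h_eps, H)
```

Because the i.i.d. spectrum is degenerate in the limit, every epsilon-quantile converges to H; the epsilon-dependence only becomes informative for non-ergodic or otherwise ill-behaved sources, which is the setting the paper addresses.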