Full metadata record
DC Field | Value | Language
dc.contributor.author | Chen, PN | en_US
dc.contributor.author | Alajaji, F | en_US
dc.date.accessioned | 2014-12-08T15:49:07Z | -
dc.date.available | 2014-12-08T15:49:07Z | -
dc.date.issued | 1998-05-01 | en_US
dc.identifier.issn | 0253-3839 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/32649 | -
dc.description.abstract | Expressions for the ε-entropy rate, ε-mutual information rate and ε-divergence rate are introduced. These quantities, which consist of the quantiles of the asymptotic information spectra, generalize the inf/sup-entropy/information/divergence rates of Han and Verdú. The algebraic properties of these information measures are rigorously analyzed, and examples illustrating their use in the computation of the ε-capacity are presented. In Part II of this work, these measures are employed to prove general source coding theorems for block codes, and to establish the general formula for the type-II error exponent of Neyman-Pearson hypothesis testing subject to upper bounds on the type-I error probability. | en_US
dc.language.iso | en_US | en_US
dc.subject | information theory | en_US
dc.subject | entropy | en_US
dc.subject | mutual information | en_US
dc.subject | divergence | en_US
dc.subject | e-capacity | en_US
dc.title | Generalized source coding theorems and hypothesis testing: Part I - Information measures | en_US
dc.type | Article | en_US
dc.identifier.journal | JOURNAL OF THE CHINESE INSTITUTE OF ENGINEERS | en_US
dc.citation.volume | 21 | en_US
dc.citation.issue | 3 | en_US
dc.citation.spage | 283 | en_US
dc.citation.epage | 292 | en_US
dc.contributor.department | 電信工程研究所 | zh_TW
dc.contributor.department | Institute of Communications Engineering | en_US
dc.identifier.wosnumber | WOS:000074038300004 | -
dc.citation.woscount | 2 | -
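
As a gloss on the abstract above: the ε-entropy rate is described there as a quantile of the asymptotic entropy spectrum. The sketch below states one plausible formalization in standard Han-Verdú information-spectrum notation; the inequality conventions (strict vs. non-strict, limsup vs. liminf) are assumptions here, and the paper's exact definitions may differ.

```latex
% Normalized entropy density (the "entropy spectrum" random variable):
%   h_n = (1/n) log 1/P_{X^n}(X^n).
% The eps-inf-entropy rate is its asymptotic eps-quantile; eps = 0
% recovers the Han-Verdu inf-entropy rate (liminf in probability).
\[
  h_n \triangleq \frac{1}{n}\,\log\frac{1}{P_{X^n}(X^n)}, \qquad
  \underline{H}_{\varepsilon}(\mathbf{X}) \triangleq
  \sup\Bigl\{\,\theta :
    \limsup_{n\to\infty}\Pr\bigl[h_n < \theta\bigr] \le \varepsilon
  \Bigr\},\quad \varepsilon\in[0,1).
\]
```

The ε-mutual information and ε-divergence rates are obtained analogously by replacing the entropy density with the corresponding information or divergence density.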
Appears in Collections: Articles


Files in This Item:

  1. 000074038300004.pdf
