Title: Generalized source coding theorems and hypothesis testing: Part I - Information measures
Authors: Chen, PN
Alajaji, F
Institution: Institute of Communications Engineering
Keywords: information theory; entropy; mutual information; divergence; epsilon-capacity
Publication date: 1-May-1998
Abstract: Expressions for the epsilon-entropy rate, epsilon-mutual information rate, and epsilon-divergence rate are introduced. These quantities, which consist of the quantiles of the asymptotic information spectra, generalize the inf/sup-entropy/information/divergence rates of Han and Verdú. The algebraic properties of these information measures are rigorously analyzed, and examples illustrating their use in the computation of the epsilon-capacity are presented. In Part II of this work, these measures are employed to prove general source coding theorems for block codes, and to establish the general formula for the Neyman-Pearson type-II error exponent in hypothesis testing subject to upper bounds on the type-I error probability.
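Note: The abstract describes these measures as quantiles of the asymptotic information spectrum. One plausible formalization of the epsilon-entropy rate in that spirit (a sketch only; the exact convention, strictness of inequalities, and notation used in the paper may differ) is:

```latex
% For a source X = \{X^n\}_{n\ge 1} and \varepsilon \in [0,1),
% a quantile of the (sup-)entropy spectrum can be written as
\bar{H}_{\varepsilon}(X) \;\triangleq\;
  \inf\Bigl\{\theta :\;
    \limsup_{n\to\infty}
    \Pr\Bigl[\tfrac{1}{n}\log\tfrac{1}{P_{X^n}(X^n)} > \theta\Bigr]
    \le \varepsilon
  \Bigr\}.
```

Under this convention, setting ε = 0 recovers the Han-Verdú sup-entropy rate, which is consistent with the abstract's claim that these quantities generalize the inf/sup rates.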
URI: http://hdl.handle.net/11536/32649
ISSN: 0253-3839
Journal: JOURNAL OF THE CHINESE INSTITUTE OF ENGINEERS
Volume: 21
Issue: 3
Start page: 283
End page: 292
Appears in Collections: Articles


Files in This Item:

  1. 000074038300004.pdf
