Title: Generalized source coding theorems and hypothesis testing: Part I - Information measures
Authors: Chen, PN
Alajaji, F
Affiliation: Institute of Communications Engineering
Keywords: information theory; entropy; mutual information; divergence; epsilon-capacity
Issue Date: 1-May-1998
Abstract: Expressions for the epsilon-entropy rate, epsilon-mutual information rate, and epsilon-divergence rate are introduced. These quantities, which consist of the quantiles of the asymptotic information spectra, generalize the inf/sup-entropy/information/divergence rates of Han and Verdú. The algebraic properties of these information measures are rigorously analyzed, and examples illustrating their use in the computation of the epsilon-capacity are presented. In Part II of this work, these measures are employed to prove general source coding theorems for block codes and to establish a general formula for the Neyman-Pearson type-II error exponent of hypothesis testing subject to upper bounds on the type-I error probability.
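
For context, the following is a hedged sketch, not the paper's exact statement, of the quantile-style definition such epsilon-rates take in the Han and Verdú information-spectrum framework; the notation h_n(X^n) for the normalized entropy density is an assumption for illustration only.

% Hedged sketch (assumed notation; the paper's formulation may differ):
% the epsilon-inf-entropy rate as a quantile of the limiting entropy spectrum,
% where h_n(X^n) = -(1/n) log P_{X^n}(X^n) is the normalized entropy density.
\[
  \underline{H}_{\varepsilon}(\mathbf{X})
  \;=\;
  \sup\Bigl\{\theta \in \mathbb{R} :
      \limsup_{n\to\infty}\Pr\bigl[\,h_n(X^n) < \theta\,\bigr] \le \varepsilon
    \Bigr\},
  \qquad 0 \le \varepsilon < 1 .
\]

Under this reading, epsilon = 0 recovers the usual inf-entropy rate, and the analogous quantile constructions on the information and divergence densities would yield the epsilon-mutual information and epsilon-divergence rates described in the abstract.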
URI: http://hdl.handle.net/11536/32649
ISSN: 0253-3839
Journal: JOURNAL OF THE CHINESE INSTITUTE OF ENGINEERS
Volume: 21
Issue: 3
Start page: 283
End page: 292
Appears in Collections: Journal Articles


Files in This Item:

  1. 000074038300004.pdf
