Full metadata record
DC Field | Value | Language
dc.contributor.author | Alajaji, F | en_US
dc.contributor.author | Chen, PN | en_US
dc.contributor.author | Rached, Z | en_US
dc.date.accessioned | 2014-12-08T15:39:22Z | -
dc.date.available | 2014-12-08T15:39:22Z | -
dc.date.issued | 2004-04-01 | en_US
dc.identifier.issn | 0018-9448 | en_US
dc.identifier.uri | http://dx.doi.org/10.1109/TIT.2004.825040 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/26890 | -
dc.description.abstract | In [6], Csiszar established the concept of forward beta-cutoff rate for the error exponent hypothesis testing problem based on independent and identically distributed (i.i.d.) observations. Given beta < 0, he defined the forward beta-cutoff rate as the number R_0 >= 0 that provides the best possible lower bound of the form beta(E - R_0) to the type 1 error exponent function for hypothesis testing, where 0 < E < R_0 is the rate of exponential convergence to 0 of the type 2 error probability. He then demonstrated that the forward beta-cutoff rate is given by D_{1/(1-beta)}(X || X-bar), where D_alpha(X || X-bar) denotes the Renyi alpha-divergence [19], alpha > 0, alpha != 1. Similarly, for 0 < beta < 1, Csiszar also established the concept of reverse beta-cutoff rate for the correct exponent hypothesis testing problem. In this work, we extend Csiszar's results by investigating the forward and reverse beta-cutoff rates for hypothesis testing between two arbitrary sources with memory. We demonstrate that the lim inf Renyi alpha-divergence rate provides the expression for the forward beta-cutoff rate. We also show that if the log-likelihood large deviation spectrum admits a limit, then the reverse beta-cutoff rate equals the lim inf alpha-divergence rate with alpha = 1/(1-beta) and 0 < beta < beta_max, where beta_max is the largest beta < 1 for which the lim inf 1/(1-beta)-divergence rate is finite. For beta_max <= beta < 1, we show that the reverse cutoff rate is in general only upper-bounded by the lim inf Renyi divergence rate. Unlike in [4], where the alphabet for the source coding cutoff rate problem was assumed to be finite, we assume an arbitrary (countable or continuous) source alphabet. We also provide several examples to illustrate our forward and reverse beta-cutoff rate results and the techniques employed to establish them. | en_US
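The abstract quotes two standard definitions: the Renyi alpha-divergence D_alpha(P || Q) = (1/(alpha - 1)) log sum_x P(x)^alpha Q(x)^(1-alpha), and Csiszar's i.i.d. forward beta-cutoff rate, which equals D_{1/(1-beta)}(X || X-bar). The sketch below illustrates both for finite-alphabet distributions; it is a minimal illustration of these textbook formulas (function names and example distributions are ours), not code from the paper, which treats general sources with memory.

```python
import math

def renyi_divergence(p, q, alpha):
    """Renyi alpha-divergence D_alpha(p||q) in nats, for alpha > 0, alpha != 1.

    p and q are probability vectors over the same finite alphabet."""
    if alpha <= 0 or alpha == 1:
        raise ValueError("alpha must be positive and not equal to 1")
    # Sum p_i^alpha * q_i^(1-alpha) over the support of p.
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

def forward_cutoff_rate(p, q, beta):
    """Csiszar's forward beta-cutoff rate in the i.i.d. case:
    R_0 = D_{1/(1-beta)}(p||q), defined for beta < 0."""
    if beta >= 0:
        raise ValueError("forward cutoff rate is defined for beta < 0")
    return renyi_divergence(p, q, 1.0 / (1.0 - beta))

# Hypothetical example: fair coin vs. heavily biased coin, beta = -1
# (so alpha = 1/(1-beta) = 0.5).
p = [0.5, 0.5]
q = [0.9, 0.1]
print(forward_cutoff_rate(p, q, -1.0))
```

For beta < 0 the order alpha = 1/(1 - beta) lies in (0, 1), so the sum above is always finite for distributions with common support; the divergence-rate subtleties discussed in the abstract only arise for general sources with memory.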
dc.language.iso | en_US | en_US
dc.subject | alpha-divergence rate | en_US
dc.subject | arbitrary sources with memory | en_US
dc.subject | forward and reverse cutoff rates | en_US
dc.subject | hypothesis testing error and correct exponents | en_US
dc.subject | information spectrum | en_US
dc.subject | large deviation theory | en_US
dc.title | Csiszar's cutoff rates for the general hypothesis testing problem | en_US
dc.type | Article; Proceedings Paper | en_US
dc.identifier.doi | 10.1109/TIT.2004.825040 | en_US
dc.identifier.journal | IEEE TRANSACTIONS ON INFORMATION THEORY | en_US
dc.citation.volume | 50 | en_US
dc.citation.issue | 4 | en_US
dc.citation.spage | 663 | en_US
dc.citation.epage | 678 | en_US
dc.contributor.department | 電信工程研究所 | zh_TW
dc.contributor.department | Institute of Communications Engineering | en_US
dc.identifier.wosnumber | WOS:000220475700009 | -
Appears in Collections: Conference Papers


Files in This Item:

  1. 000220475700009.pdf
