Full metadata record
DC Field: Value (Language)
dc.contributor.author: Tsai, SC (en_US)
dc.contributor.author: Tzeng, WG (en_US)
dc.contributor.author: Wu, HL (en_US)
dc.date.accessioned: 2014-12-08T15:18:35Z
dc.date.available: 2014-12-08T15:18:35Z
dc.date.issued: 2005-09-01 (en_US)
dc.identifier.issn: 0018-9448 (en_US)
dc.identifier.uri: http://dx.doi.org/10.1109/TIT.2005.853308 (en_US)
dc.identifier.uri: http://hdl.handle.net/11536/13374
dc.description.abstract: We study distance measures between two probability distributions under two different metrics: a new metric induced from the Jensen-Shannon divergence, and the well-known L-1 metric. We show that several important results and constructions in computational complexity under the L-1 metric carry over to the new metric, such as Yao's next-bit predictor, the existence of extractors, the leftover hash lemma, and the construction of an expander-graph-based extractor. Finally, we show that the parity lemma, useful in studying pseudorandomness, does not hold under the new metric. (en_US)
dc.language.iso: en_US (en_US)
dc.subject: Jensen-Shannon divergence (en_US)
dc.subject: expander (en_US)
dc.subject: extractors (en_US)
dc.subject: leftover hash lemma (en_US)
dc.subject: parity lemma (en_US)
dc.title: On the Jensen-Shannon divergence and variational distance (en_US)
dc.type: Article; Proceedings Paper (en_US)
dc.identifier.doi: 10.1109/TIT.2005.853308 (en_US)
dc.identifier.journal: IEEE TRANSACTIONS ON INFORMATION THEORY (en_US)
dc.citation.volume: 51 (en_US)
dc.citation.issue: 9 (en_US)
dc.citation.spage: 3333 (en_US)
dc.citation.epage: 3336 (en_US)
dc.contributor.department: 資訊工程學系 (zh_TW)
dc.contributor.department: Department of Computer Science (en_US)
dc.identifier.wosnumber: WOS:000231392900027
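The abstract above contrasts the L-1 (variational) distance with the Jensen-Shannon divergence, whose square root induces a metric. A minimal numeric sketch of the two quantities for discrete distributions (the helper names are mine, not from the paper):

```python
import math

def l1_distance(p, q):
    """L-1 (variational) distance: sum of |p_i - q_i|."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """Kullback-Leibler divergence in bits; terms with p_i = 0 contribute 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL to the midpoint distribution."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.5, 0.5]
q = [1.0, 0.0]
print(l1_distance(p, q))                  # 1.0
print(js_divergence(p, q))                # ≈ 0.3113 bits
print(math.sqrt(js_divergence(p, q)))     # sqrt(JSD) is the induced metric
```

The square root in the last line is what turns the (non-metric) divergence into the metric the paper studies alongside L-1.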
Appears in Collections: Conference Papers


Files in This Item:

  1. 000231392900027.pdf

If the file is a zip archive, download and extract it, then open index.html in the extracted folder with a browser to view the full text.