Full metadata record
DC Field | Value | Language
dc.contributor.author | Tsai, SC | en_US
dc.contributor.author | Tzeng, WG | en_US
dc.contributor.author | Wu, HL | en_US
dc.date.accessioned | 2014-12-08T15:18:35Z | -
dc.date.available | 2014-12-08T15:18:35Z | -
dc.date.issued | 2005-09-01 | en_US
dc.identifier.issn | 0018-9448 | en_US
dc.identifier.uri | http://dx.doi.org/10.1109/TIT.2005.853308 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/13374 | -
dc.description.abstract | We study distance measures between two probability distributions via two different metrics: a new metric induced from the Jensen-Shannon divergence, and the well-known L1 metric. We show that several important results and constructions in computational complexity under the L1 metric carry over to the new metric, such as Yao's next-bit predictor, the existence of extractors, the leftover hash lemma, and the construction of expander-graph-based extractors. Finally, we show that the parity lemma, which is useful in studying pseudorandomness, does not hold under the new metric. | en_US
dc.language.iso | en_US | en_US
dc.subject | Jensen-Shannon divergence | en_US
dc.subject | expander | en_US
dc.subject | extractors | en_US
dc.subject | leftover hash lemma | en_US
dc.subject | parity lemma | en_US
dc.title | On the Jensen-Shannon divergence and variational distance | en_US
dc.type | Article; Proceedings Paper | en_US
dc.identifier.doi | 10.1109/TIT.2005.853308 | en_US
dc.identifier.journal | IEEE TRANSACTIONS ON INFORMATION THEORY | en_US
dc.citation.volume | 51 | en_US
dc.citation.issue | 9 | en_US
dc.citation.spage | 3333 | en_US
dc.citation.epage | 3336 | en_US
dc.contributor.department | 資訊工程學系 | zh_TW
dc.contributor.department | Department of Computer Science | en_US
dc.identifier.wosnumber | WOS:000231392900027 | -
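
As an illustrative aside, not part of the record itself: the sketch below computes the two distance measures named in the abstract for discrete distributions represented as NumPy arrays. The function names are invented for this example, and the square root of the Jensen-Shannon divergence is used as the induced metric, which may differ from the paper's exact formulation.

```python
# Illustrative sketch (not from the paper): the two distance measures
# named in the abstract, for discrete distributions over a finite set.
import numpy as np

def l1_distance(p, q):
    """L1 (variational) distance: sum of absolute coordinate differences."""
    return np.abs(p - q).sum()

def js_divergence(p, q):
    """Jensen-Shannon divergence: mean KL divergence to the midpoint m."""
    m = 0.5 * (p + q)

    def kl(a, b):
        mask = a > 0                      # treat 0 * log(0/x) as 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.5, 0.5, 0.0])
q = np.array([0.25, 0.25, 0.5])
print(l1_distance(p, q))                  # 1.0
print(np.sqrt(js_divergence(p, q)))       # sqrt(JSD), roughly 0.56 here
```

The square root is taken because the Jensen-Shannon divergence itself does not satisfy the triangle inequality, whereas its square root is known to be a metric.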
Appears in Collections: Conferences Paper


Files in This Item:

  1. 000231392900027.pdf
