Full metadata record
DC Field: Value (Language)
dc.contributor.author: Lee, Chia-Jung (en_US)
dc.contributor.author: Lu, Chi-Jen (en_US)
dc.contributor.author: Tsai, Shi-Chun (en_US)
dc.date.accessioned: 2014-12-08T15:20:49Z
dc.date.available: 2014-12-08T15:20:49Z
dc.date.issued: 2011-08-01 (en_US)
dc.identifier.issn: 0018-9448 (en_US)
dc.identifier.uri: http://dx.doi.org/10.1109/TIT.2011.2158897 (en_US)
dc.identifier.uri: http://hdl.handle.net/11536/14805
dc.description.abstract: We study the task of deterministically extracting randomness from sources containing computational entropy. The sources we consider have the form of a conditional distribution (f(X) | X), for some function f and some distribution X, and we say that such a source has computational min-entropy k if any circuit of size 2^k can predict f(x) correctly with probability at most 2^(-k) given input x sampled from X. We first show that it is impossible for a seedless extractor to extract from a single source of this kind. We then show that extraction becomes possible if we are allowed a seed which is weakly random (instead of perfectly random) but contains some statistical min-entropy, or even a seed which is not random at all but contains some computational min-entropy. This can be seen as a step toward extending the study of multisource extractors from the traditional, statistical setting to a computational setting. We reduce the task of constructing such extractors to a problem in computational learning theory: learning linear functions under arbitrary distribution with adversarial noise, and we provide a learning algorithm for this problem. This problem is well recognized in computational learning theory, and variants of it have been studied intensively before. Thus, in addition to its application to extractors, our learning algorithm is of independent interest and can be considered the main technical contribution of this paper. (en_US)
dc.language.iso: en_US (en_US)
dc.subject: Computational min-entropy (en_US)
dc.subject: randomness extractors (en_US)
dc.subject: learning linear functions (en_US)
dc.subject: computational complexity (en_US)
dc.title: Extracting Computational Entropy and Learning Noisy Linear Functions (en_US)
dc.type: Article (en_US)
dc.identifier.doi: 10.1109/TIT.2011.2158897 (en_US)
dc.identifier.journal: IEEE TRANSACTIONS ON INFORMATION THEORY (en_US)
dc.citation.volume: 57 (en_US)
dc.citation.issue: 8 (en_US)
dc.citation.spage: 5485 (en_US)
dc.citation.epage: 5496 (en_US)
dc.contributor.department: 資訊工程學系 [Department of Computer Science] (zh_TW)
dc.contributor.department: Department of Computer Science (en_US)
dc.identifier.wosnumber: WOS:000295738500038
dc.citation.woscount: 0
Appears in Collections: Articles


Files in This Item:

  1. 000295738500038.pdf

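The abstract's reduction targets the problem of learning linear functions under an arbitrary distribution with adversarial noise. The following toy sketch (not the paper's algorithm) illustrates that problem's setup over GF(2): a hidden function f(x) = <a, x> mod 2 generates labels, an adversary corrupts a fraction of them, and for a tiny dimension n the learner can recover a by exhaustive search for the candidate with the fewest disagreements. All names and parameters here (`learn_noisy_linear`, n = 6, noise rate 0.1) are illustrative assumptions.

```python
import itertools
import random

def inner(a, x):
    """Inner product <a, x> over GF(2)."""
    return sum(ai & xi for ai, xi in zip(a, x)) % 2

def learn_noisy_linear(samples, n):
    """Return the candidate a in {0,1}^n that disagrees with the
    fewest labeled samples (brute force; feasible only for tiny n)."""
    best, best_err = None, float("inf")
    for a in itertools.product([0, 1], repeat=n):
        err = sum(inner(a, x) != y for x, y in samples)
        if err < best_err:
            best, best_err = a, err
    return best

random.seed(0)
n, m, eta = 6, 200, 0.1           # dimension, sample count, noise rate
secret = tuple(random.randint(0, 1) for _ in range(n))

# Draw samples (x, <secret, x>) and corrupt a fraction eta of the labels.
# (Adversarial noise is modeled here by random label flips for simplicity.)
samples = []
for _ in range(m):
    x = tuple(random.randint(0, 1) for _ in range(n))
    y = inner(secret, x)
    if random.random() < eta:
        y ^= 1
    samples.append((x, y))

recovered = learn_noisy_linear(samples, n)
print(recovered == secret)
```

Because any wrong candidate disagrees with the true linear function on about half of the random samples, while the true one disagrees only on the roughly eta fraction that was corrupted, the minimum-disagreement candidate is the hidden vector with overwhelming probability at these parameters.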