Full metadata record
DC Field | Value | Language
dc.contributor.author | Lee, Chia-Jung | en_US
dc.contributor.author | Lu, Chi-Jen | en_US
dc.contributor.author | Tsai, Shi-Chun | en_US
dc.date.accessioned | 2014-12-08T15:22:22Z | -
dc.date.available | 2014-12-08T15:22:22Z | -
dc.date.issued | 2009 | en_US
dc.identifier.isbn | 978-3-642-02881-6 | en_US
dc.identifier.issn | 0302-9743 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/15840 | -
dc.description.abstract | We study the task of deterministically extracting randomness from sources containing computational entropy. The sources we consider have the form of a conditional distribution (f(X) | X), for some function f and some distribution X, and we say that such a source has computational min-entropy k if any circuit of size 2^k can only predict f(x) correctly with probability at most 2^(-k) given input x sampled from X. We first show that it is impossible to have a seedless extractor to extract from one single source of this kind. Then we show that it becomes possible if we are allowed a seed which is weakly random (instead of perfectly random) but contains some statistical min-entropy, or even a seed which is not random at all but contains some computational min-entropy. This can be seen as a step toward extending the study of multi-source extractors from the traditional, statistical setting to a computational setting. We reduce the task of constructing such extractors to a problem in learning theory: learning linear functions under arbitrary distribution with adversarial noise. For this problem, we provide a learning algorithm that may be of independent interest. | en_US
dc.language.iso | en_US | en_US
dc.title | Extracting Computational Entropy and Learning Noisy Linear Functions | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | COMPUTING AND COMBINATORICS, PROCEEDINGS | en_US
dc.citation.volume | 5609 | en_US
dc.citation.spage | 338 | en_US
dc.citation.epage | 347 | en_US
dc.contributor.department | 資訊工程學系 | zh_TW
dc.contributor.department | Department of Computer Science | en_US
dc.identifier.wosnumber | WOS:000269148100034 | -
Appears in Collections: Conference Papers