Full metadata record
DC Field | Value | Language
dc.contributor.author | Liu, Shing-Jiuan | en_US
dc.contributor.author | Chang, Ronald Y. | en_US
dc.contributor.author | Chien, Feng-Tsun | en_US
dc.date.accessioned | 2020-10-05T02:01:27Z | -
dc.date.available | 2020-10-05T02:01:27Z | -
dc.date.issued | 2019-01-01 | en_US
dc.identifier.isbn | 978-1-7281-0962-6 | en_US
dc.identifier.issn | 2334-0983 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/155233 | -
dc.description.abstract | Device-free indoor localization is a key enabling technology for many Internet of Things (IoT) applications. Deep neural network (DNN)-based location estimators achieve high-precision localization performance by automatically learning discriminative features from noisy wireless signals without much human intervention. However, the inner workings of DNNs are not transparent and not adequately understood, especially in wireless localization applications. In this paper, we conduct visual analyses of DNN-based location estimators trained with WiFi channel state information (CSI) fingerprints in a real-world experiment. Via visualization techniques, we address such questions as 1) how well the DNN has learned and been trained, and 2) what critical features the DNN has learned to distinguish different classes. The results provide plausible explanations and allow for a better understanding of the mechanism of DNN-based wireless indoor localization. | en_US
dc.language.iso | en_US | en_US
dc.title | Visual Analysis of Deep Neural Networks for Device-Free Wireless Localization | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | 2019 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM) | en_US
dc.citation.spage | 0 | en_US
dc.citation.epage | 0 | en_US
dc.contributor.department | 電子工程學系及電子研究所 | zh_TW
dc.contributor.department | Department of Electronics Engineering and Institute of Electronics | en_US
dc.identifier.wosnumber | WOS:000552238602022 | en_US
dc.citation.woscount | 0 | en_US
Appears in Collections: Conference Papers