Full metadata record
DC Field: Value (Language)
dc.contributor.author: Chang, Chih-Cheng (en_US)
dc.contributor.author: Wu, Ming-Hung (en_US)
dc.contributor.author: Lin, Jia-Wei (en_US)
dc.contributor.author: Li, Chun-Hsien (en_US)
dc.contributor.author: Parmar, Vivek (en_US)
dc.contributor.author: Lee, Heng-Yuan (en_US)
dc.contributor.author: Wei, Jeng-Hua (en_US)
dc.contributor.author: Sheu, Shyh-Shyuan (en_US)
dc.contributor.author: Suri, Manan (en_US)
dc.contributor.author: Chang, Tian-Sheuan (en_US)
dc.contributor.author: Hou, Tuo-Hung (en_US)
dc.date.accessioned: 2019-10-05T00:09:48Z
dc.date.available: 2019-10-05T00:09:48Z
dc.date.issued: 2019-01-01 (en_US)
dc.identifier.isbn: 978-1-4503-6725-7 (en_US)
dc.identifier.uri: http://dx.doi.org/10.1145/3316781.3317872 (en_US)
dc.identifier.uri: http://hdl.handle.net/11536/152982
dc.description.abstract: Binary STT-MRAM is a highly anticipated embedded non-volatile memory technology for advanced logic nodes below 28 nm. Enabling its in-memory computing (IMC) capability is critical for enhancing AI at the edge. Based on soon-available STT-MRAM, we report the first binary deep convolutional neural network (NV-BNN) capable of both local and remote learning. By exploiting the intrinsic cumulative switching probability, accurate online training on CIFAR-10 color images (~90% accuracy) is realized using a relaxed endurance spec (<= 20 switching events) and a hybrid digital/IMC design. For offline training, the accuracy loss due to imprecise weight placement can be mitigated using a rapid, non-iterative training-with-noise and fine-tuning scheme. (en_US)
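The abstract's cumulative-switching idea lends itself to a short illustration. Below is a minimal, hypothetical Python sketch (not the authors' implementation): each write pulse flips a binary STT-MRAM cell with some per-pulse probability p, so n pulses switch it with cumulative probability 1 - (1 - p)^n, and each actual flip consumes one unit of the relaxed endurance budget (<= 20 switching events) cited in the abstract. The per-pulse probability, pulse limit, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

P_SWITCH = 0.5   # assumed per-pulse switching probability (illustrative)
ENDURANCE = 20   # relaxed endurance budget from the abstract (<= 20 switches)

def cumulative_switch_prob(p, n):
    """Probability that a cell has switched after n identical pulses."""
    return 1.0 - (1.0 - p) ** n

def stochastic_write(state, target, switches_used, p=P_SWITCH, max_pulses=8):
    """Pulse the cell toward `target` until it flips or the pulse budget runs out.

    Each pulse flips the cell with probability p, so repeated pulses realize
    the cumulative switching probability 1 - (1 - p)^n. A successful flip
    consumes one unit of the cell's endurance budget.
    """
    for _ in range(max_pulses):
        if state == target:
            return state, switches_used
        if switches_used >= ENDURANCE:
            break  # endurance exhausted; the write is abandoned
        if rng.random() < p:  # stochastic STT switching event
            state = target
            switches_used += 1
    return state, switches_used

# Example: one binary weight flipped back and forth during online training.
state, used = -1, 0
for target in [+1, -1, +1, +1, -1]:
    state, used = stochastic_write(state, target, used)
print(f"final state={state}, switching events used={used}/{ENDURANCE}")
print(f"P(switched) after 4 pulses at p=0.5: {cumulative_switch_prob(0.5, 4):.3f}")
```

Running the example shows the intended trade-off: a modest pulse budget drives the write success probability toward 1, while the endurance counter advances only on actual flips, which is why a relaxed spec of <= 20 switching events can suffice for training.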
dc.language.iso: en_US (en_US)
dc.title: NV-BNN: An Accurate Deep Convolutional Neural Network Based on Binary STT-MRAM for Adaptive AI Edge (en_US)
dc.type: Proceedings Paper (en_US)
dc.identifier.doi: 10.1145/3316781.3317872 (en_US)
dc.identifier.journal: PROCEEDINGS OF THE 2019 56TH ACM/EDAC/IEEE DESIGN AUTOMATION CONFERENCE (DAC) (en_US)
dc.citation.spage: 0 (en_US)
dc.citation.epage: 0 (en_US)
dc.contributor.department: 電子工程學系及電子研究所 (zh_TW)
dc.contributor.department: Department of Electronics Engineering and Institute of Electronics (en_US)
dc.identifier.wosnumber: WOS:000482058200030 (en_US)
dc.citation.woscount: 0 (en_US)
Appears in Collections: Conferences Paper