Full metadata record
DC Field | Value | Language
dc.contributor.author | Fang, Wai-Chi | en_US
dc.contributor.author | Wang, Kai-Yen | en_US
dc.contributor.author | Fahier, Nicolas | en_US
dc.contributor.author | Ho, Yun-Lung | en_US
dc.contributor.author | Huang, Yu-De | en_US
dc.date.accessioned | 2020-02-02T23:54:39Z | -
dc.date.available | 2020-02-02T23:54:39Z | -
dc.date.issued | 2019-12-01 | en_US
dc.identifier.issn | 2156-3357 | en_US
dc.identifier.uri | http://dx.doi.org/10.1109/JETCAS.2019.2951232 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/153596 | -
dc.description.abstract | This study proposed an electroencephalogram (EEG)-based real-time emotion recognition hardware system architecture based on a multiphase convolutional neural network (CNN) algorithm implemented on a 28-nm technology chip and on a field-programmable gate array (FPGA) for binary and quaternary classification. Sample entropy, differential asymmetry, short-time Fourier transform, and a channel reconstruction method were used for emotion feature extraction. In this work, six EEG channels were selected (FP1, FP2, F3, F4, F7, and F8), and EEG images were generated from spectrogram fusions. The complete CNN architecture included training and acceleration for efficient artificial intelligence (AI) edge applications, and we proposed a multiphase CNN execution method to accommodate hardware resource constraints. Datasets of 32 subjects from the DEAP database were used to validate the proposed design, exhibiting mean accuracies of 83.36% for valence binary classification and 76.67% for valence-arousal quaternary classification. The core area and total power consumption of the CNN chip were 1.83 x 1.83 mm² and 76.61 mW, respectively. The chip operation was validated using an ADVANTEST V93000 PS1600, and the training process and real-time classification took 0.12495 ms and 0.02634 ms per EEG image, respectively. The proposed EEG-based real-time emotion recognition system included a dry-electrode EEG headset, a feature extraction processor, a CNN chip platform, and a graphical user interface, and the execution time was 450 ms for each emotional-state recognition. | en_US
dc.language.iso | en_US | en_US
dc.subject | Electroencephalography | en_US
dc.subject | Emotion recognition | en_US
dc.subject | Real-time systems | en_US
dc.subject | Artificial intelligence | en_US
dc.subject | Feature extraction | en_US
dc.subject | Convolutional neural networks | en_US
dc.subject | System-on-chip | en_US
dc.subject | Emotion recognition | en_US
dc.subject | convolutional neural network (CNN) | en_US
dc.subject | system-on-chip | en_US
dc.subject | electroencephalography | en_US
dc.subject | affective computing | en_US
dc.title | Development and Validation of an EEG-Based Real-Time Emotion Recognition System Using Edge AI Computing Platform With Convolutional Neural Network System-on-Chip Design | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1109/JETCAS.2019.2951232 | en_US
dc.identifier.journal | IEEE JOURNAL ON EMERGING AND SELECTED TOPICS IN CIRCUITS AND SYSTEMS | en_US
dc.citation.volume | 9 | en_US
dc.citation.issue | 4 | en_US
dc.citation.spage | 645 | en_US
dc.citation.epage | 657 | en_US
dc.contributor.department | 電子工程學系及電子研究所 | zh_TW
dc.contributor.department | Department of Electronics Engineering and Institute of Electronics | en_US
dc.identifier.wosnumber | WOS:000502993500006 | en_US
dc.citation.woscount | 0 | en_US
Appears in Collections: Journal Articles
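The abstract above describes feature extraction from six frontal channels (FP1, FP2, F3, F4, F7, F8) using sample entropy, differential asymmetry, a short-time Fourier transform, and spectrogram fusion into EEG images. The following is a minimal illustrative sketch of only the spectrogram-fusion and differential-asymmetry ideas; it is not the authors' implementation, and the sampling rate, window length, function names, and channel-pair indexing are assumptions made for demonstration.

```python
# Hypothetical sketch (not the paper's code): fuse per-channel STFT
# spectrograms into a multi-channel "EEG image" and compute left-right
# differential asymmetry for symmetric frontal channel pairs.
import numpy as np
from scipy.signal import stft

CHANNELS = ["FP1", "FP2", "F3", "F4", "F7", "F8"]  # channels named in the abstract
FS = 128  # Hz; assumed sampling rate (DEAP's preprocessed data is 128 Hz)

def channel_spectrogram(x, fs=FS, nperseg=128):
    """Log-magnitude STFT spectrogram of one EEG channel."""
    _, _, Z = stft(x, fs=fs, nperseg=nperseg)
    return np.log1p(np.abs(Z))  # shape: (freq bins, time frames)

def fuse_spectrograms(eeg, fs=FS):
    """Stack per-channel spectrograms into one multi-channel 'EEG image'.

    eeg: array of shape (6, n_samples), rows ordered as CHANNELS.
    Returns an array of shape (6, freq bins, time frames).
    """
    return np.stack([channel_spectrogram(ch, fs) for ch in eeg])

def differential_asymmetry(eeg, pairs=((0, 1), (2, 3), (4, 5))):
    """Left-right mean-power differences for the symmetric pairs
    FP1-FP2, F3-F4, and F7-F8 (a common asymmetry feature)."""
    power = np.mean(eeg ** 2, axis=1)  # mean power per channel
    return np.array([power[l] - power[r] for l, r in pairs])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    segment = rng.standard_normal((6, FS * 4))  # 4 s of synthetic 6-channel EEG
    image = fuse_spectrograms(segment)
    dasm = differential_asymmetry(segment)
    print(image.shape, dasm.shape)  # e.g. (6, 65, 9) and (3,)
```

In the paper's pipeline such fused images feed a multiphase CNN on the chip/FPGA platform; the sketch stops at feature construction and omits sample entropy, channel reconstruction, and the CNN itself.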