Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Fang, Wai-Chi | en_US |
dc.contributor.author | Wang, Kai-Yen | en_US |
dc.contributor.author | Fahier, Nicolas | en_US |
dc.contributor.author | Ho, Yun-Lung | en_US |
dc.contributor.author | Huang, Yu-De | en_US |
dc.date.accessioned | 2020-02-02T23:54:39Z | - |
dc.date.available | 2020-02-02T23:54:39Z | - |
dc.date.issued | 2019-12-01 | en_US |
dc.identifier.issn | 2156-3357 | en_US |
dc.identifier.uri | http://dx.doi.org/10.1109/JETCAS.2019.2951232 | en_US |
dc.identifier.uri | http://hdl.handle.net/11536/153596 | - |
dc.description.abstract | This study proposed an electroencephalogram (EEG)-based real-time emotion recognition hardware system architecture based on a multiphase convolutional neural network (CNN) algorithm, implemented on a 28-nm technology chip and on a field-programmable gate array (FPGA) for binary and quaternary classification. Sample entropy, differential asymmetry, short-time Fourier transform, and a channel reconstruction method were used for emotion feature extraction. In this work, six EEG channels were selected (FP1, FP2, F3, F4, F7, and F8), and EEG images were generated from spectrogram fusion. The complete CNN architecture included training and acceleration for efficient artificial intelligence (AI) edge application, and we proposed a multiphase CNN execution method to accommodate hardware resource constraints. Datasets of 32 subjects from the DEAP database were used to validate the proposed design, exhibiting mean accuracies for valence binary classification and valence-arousal quaternary classification of 83.36% and 76.67%, respectively. The core area and total power consumption of the CNN chip were 1.83 x 1.83 mm² and 76.61 mW, respectively. The chip operation was validated using an ADVANTEST V93000 PS1600 tester, and the training process and real-time classification took 0.12495 ms and 0.02634 ms per EEG image, respectively. The proposed EEG-based real-time emotion recognition system included a dry-electrode EEG headset, a feature extraction processor, a CNN chip platform, and a graphical user interface, and the total execution time was 450 ms for each emotional state recognition. | en_US |
dc.language.iso | en_US | en_US |
dc.subject | Electroencephalography | en_US |
dc.subject | Emotion recognition | en_US |
dc.subject | Real-time systems | en_US |
dc.subject | Artificial intelligence | en_US |
dc.subject | Feature extraction | en_US |
dc.subject | Convolutional neural networks | en_US |
dc.subject | System-on-chip | en_US |
dc.subject | convolutional neural network (CNN) | en_US |
dc.subject | system-on-chip | en_US |
dc.subject | electroencephalography | en_US |
dc.subject | affective computing | en_US |
dc.title | Development and Validation of an EEG-Based Real-Time Emotion Recognition System Using Edge AI Computing Platform With Convolutional Neural Network System-on-Chip Design | en_US |
dc.type | Article | en_US |
dc.identifier.doi | 10.1109/JETCAS.2019.2951232 | en_US |
dc.identifier.journal | IEEE JOURNAL ON EMERGING AND SELECTED TOPICS IN CIRCUITS AND SYSTEMS | en_US |
dc.citation.volume | 9 | en_US |
dc.citation.issue | 4 | en_US |
dc.citation.spage | 645 | en_US |
dc.citation.epage | 657 | en_US |
dc.contributor.department | 電子工程學系及電子研究所 | zh_TW |
dc.contributor.department | Department of Electronics Engineering and Institute of Electronics | en_US |
dc.identifier.wosnumber | WOS:000502993500006 | en_US |
dc.citation.woscount | 0 | en_US |
Appears in Collections: | Articles |
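
The abstract describes fusing short-time Fourier transform spectrograms from six frontal channels (FP1, FP2, F3, F4, F7, and F8) into EEG images for CNN-based classification. The snippet below is a minimal illustrative sketch of that kind of feature-extraction step, not the authors' implementation: the 128 Hz sampling rate, window parameters, and function names are assumptions.

```python
# Illustrative sketch only (not the paper's implementation): build a fused
# spectrogram "EEG image" from six frontal channels, as the abstract describes.
# The channel order, STFT window sizes, and 128 Hz sampling rate are assumptions.
import numpy as np
from scipy.signal import stft

FS = 128                                            # assumed sampling rate
CHANNELS = ["FP1", "FP2", "F3", "F4", "F7", "F8"]   # channels named in the abstract

def channel_spectrogram(signal, fs=FS, nperseg=128, noverlap=64):
    """Short-time Fourier transform magnitude for one EEG channel."""
    _, _, zxx = stft(signal, fs=fs, nperseg=nperseg, noverlap=noverlap)
    return np.abs(zxx)        # shape: (freq_bins, time_frames)

def fuse_spectrograms(eeg, fs=FS):
    """Stack per-channel spectrograms into one multi-channel image.

    eeg: array of shape (6, n_samples), ordered as CHANNELS.
    Returns an array of shape (6, freq_bins, time_frames) that a CNN
    could consume as a multi-channel input image.
    """
    specs = [channel_spectrogram(eeg[i], fs) for i in range(eeg.shape[0])]
    return np.stack(specs, axis=0)

if __name__ == "__main__":
    # Synthetic data stands in for a real DEAP trial segment.
    rng = np.random.default_rng(0)
    fake_trial = rng.standard_normal((len(CHANNELS), 4 * FS))
    image = fuse_spectrograms(fake_trial)
    print(image.shape)        # (6, freq_bins, time_frames)
```

Stacking the per-channel spectrograms along a leading axis is one plausible way to form the CNN input; the paper's own channel reconstruction and fusion details are specified in the full text rather than in this record.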