Full metadata record
DC Field | Value | Language
dc.contributor.author | Huang, Kevin P. -Y. | en_US
dc.contributor.author | Wen, Charles H. -P. | en_US
dc.contributor.author | Chiueh, Herming | en_US
dc.date.accessioned | 2017-04-21T06:49:57Z | -
dc.date.available | 2017-04-21T06:49:57Z | -
dc.date.issued | 2014 | en_US
dc.identifier.isbn | 978-1-4799-6123-8 | en_US
dc.identifier.uri | http://dx.doi.org/10.1109/HPCC.2014.166 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/136138 | -
dc.description.abstract | Hilbert-Huang Transform (HHT) is an adaptive analysis process applicable to non-linear and non-stationary data such as voice and biomedical signals. Empirical Mode Decomposition (EMD) is the key step in HHT and decomposes data into multiple Intrinsic Mode Functions (IMFs). Traditionally, EMD is computed over all data points serially, so its execution time grows at least linearly with the data size. In this work, a 3-stage parallelized EMD algorithm on a CUDA architecture is proposed to improve performance over traditional EMD. Moreover, merging cubic spline interpolation (MCSI) and GPU acceleration techniques are incorporated to achieve high parallelism and high accuracy. Experimental results show that our parallelized EMD in CUDA achieves 37.9x and 33.7x speedups with 0.0051% and 0.002% errors on voice and EEG datasets of 1 million points, respectively. | en_US
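The core EMD sifting step the abstract refers to can be sketched as follows. This is a minimal serial Python illustration of one sifting iteration (extrema detection, cubic-spline envelopes, envelope-mean subtraction), not the paper's 3-stage parallel CUDA implementation or its merging cubic spline interpolation; the function name `sift_once` and the simple endpoint handling are assumptions for the sketch.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift_once(x):
    """One serial EMD sifting iteration: subtract the mean of the upper
    and lower cubic-spline envelopes from the signal. Illustrative only;
    the paper parallelizes this process on a CUDA architecture."""
    t = np.arange(len(x))
    # Interior local maxima and minima of the signal.
    maxima = np.where((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))[0] + 1
    minima = np.where((x[1:-1] < x[:-2]) & (x[1:-1] < x[2:]))[0] + 1
    # Crude end handling (an assumption of this sketch): anchor both
    # envelopes at the signal's endpoints so the splines span the record.
    maxima = np.concatenate(([0], maxima, [len(x) - 1]))
    minima = np.concatenate(([0], minima, [len(x) - 1]))
    upper = CubicSpline(t[maxima], x[maxima])(t)  # upper envelope
    lower = CubicSpline(t[minima], x[minima])(t)  # lower envelope
    return x - (upper + lower) / 2.0              # proto-IMF candidate

# Two-tone test signal: repeated sifting should isolate the fast tone
# as the first IMF, leaving the slow tone for later decomposition.
t = np.linspace(0.0, 1.0, 1000)
x = np.sin(2 * np.pi * 30 * t) + 0.5 * np.sin(2 * np.pi * 3 * t)
h = x
for _ in range(10):
    h = sift_once(h)
```

Each data point's envelope value depends only on nearby extrema, which is why the spline construction and subtraction are amenable to the per-segment GPU parallelization the paper proposes.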
dc.language.iso | en_US | en_US
dc.subject | HHT | en_US
dc.subject | EMD | en_US
dc.subject | GPGPU | en_US
dc.subject | CUDA | en_US
dc.title | Flexible Parallelized Empirical Mode Decomposition in CUDA for Hilbert Huang Transform | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.doi | 10.1109/HPCC.2014.166 | en_US
dc.identifier.journal | 2014 IEEE INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPUTING AND COMMUNICATIONS, 2014 IEEE 6TH INTL SYMP ON CYBERSPACE SAFETY AND SECURITY, 2014 IEEE 11TH INTL CONF ON EMBEDDED SOFTWARE AND SYST (HPCC,CSS,ICESS) | en_US
dc.citation.spage | 1125 | en_US
dc.citation.epage | 1133 | en_US
dc.contributor.department | 電機學院 (College of Electrical and Computer Engineering) | zh_TW
dc.contributor.department | College of Electrical and Computer Engineering | en_US
dc.identifier.wosnumber | WOS:000380560600175 | en_US
dc.citation.woscount | 0 | en_US
Appears in Collections: Conferences Paper