Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Huang, Mou-Yue | en_US |
dc.contributor.author | Lai, Ching-Hao | en_US |
dc.contributor.author | Chen, Sin-Horng | en_US |
dc.date.accessioned | 2018-08-21T05:57:14Z | - |
dc.date.available | 2018-08-21T05:57:14Z | - |
dc.date.issued | 2017-01-01 | en_US |
dc.identifier.issn | 1522-4880 | en_US |
dc.identifier.uri | http://hdl.handle.net/11536/147215 | - |
dc.description.abstract | In order to achieve higher image recognition accuracy, deeper and wider networks have been used. However, as the network grows, its forward inference time also grows. To address this problem, we propose the Deeply Fused Branchy Network (DFB-Net), which adds small but complete side branches to the target baseline main branch. DFB-Net allows easy-to-discriminate samples to be classified quickly at early exits; for hard-to-discriminate samples, it fuses predictions by averaging softmax probabilities across branches to make a collaborative prediction. Extensive experiments on the two CIFAR datasets show that DFB-Net achieves state-of-the-art results, with an error rate of 3.07% on CIFAR-10 and 16.01% on CIFAR-100. Meanwhile, the forward inference time (with a batch size of 1, averaged over all test samples) takes only 10.4 ms on CIFAR-10 and 18.8 ms on CIFAR-100 on a GTX 1080 GPU with cuDNN 5.1. | en_US |
dc.language.iso | en_US | en_US |
dc.subject | Deep learning | en_US |
dc.subject | convolutional neural network | en_US |
dc.subject | image recognition | en_US |
dc.subject | classification | en_US |
dc.subject | inference time | en_US |
dc.title | FAST AND ACCURATE IMAGE RECOGNITION USING DEEPLY-FUSED BRANCHY NETWORKS | en_US |
dc.type | Proceedings Paper | en_US |
dc.identifier.journal | 2017 24TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP) | en_US |
dc.citation.spage | 2876 | en_US |
dc.citation.epage | 2880 | en_US |
dc.contributor.department | 電機工程學系 | zh_TW |
dc.contributor.department | Department of Electrical and Computer Engineering | en_US |
dc.identifier.wosnumber | WOS:000428410703001 | en_US |
Appears in Collections: | Conferences Paper |
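
The abstract above describes early-exit inference with softmax-probability fusion. The following is a minimal PyTorch-style sketch of that idea, not the authors' implementation: `stages`, `branches`, `head`, and the confidence `threshold` are hypothetical placeholders, since the record does not specify branch placement or the exact exit policy.

```python
import torch
import torch.nn.functional as F

def dfb_infer(x, stages, branches, head, threshold=0.9):
    """Sketch of branchy early-exit inference (batch size 1, as in the abstract).

    stages[i]  : hypothetical module advancing features along the main branch
    branches[i]: hypothetical small side classifier attached after stages[i]
    head       : hypothetical main-branch classifier
    threshold  : assumed confidence cutoff for an early exit
    """
    probs = []                                    # softmax outputs collected so far
    for stage, branch in zip(stages, branches):
        x = stage(x)                              # advance along the main branch
        p = F.softmax(branch(x), dim=1)           # side-branch prediction
        probs.append(p)
        if p.max().item() >= threshold:           # easy sample: classify early
            return p.argmax(dim=1)
    probs.append(F.softmax(head(x), dim=1))       # main-branch prediction
    fused = torch.stack(probs).mean(dim=0)        # average softmax probabilities
    return fused.argmax(dim=1)                    # collaborative prediction
```

Easy samples return at the first branch whose top softmax probability clears the threshold, which is what shortens the average forward time; hard samples fall through to the fused, averaged prediction.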