Full metadata record
DC Field: Value [Language]

dc.contributor.author: Huang, Lijyun [en_US]
dc.contributor.author: Lin, Kate Ching-Ju [en_US]
dc.contributor.author: Tseng, Yu-Chee [en_US]
dc.date.accessioned: 2020-01-02T00:03:28Z [-]
dc.date.available: 2020-01-02T00:03:28Z [-]
dc.date.issued: 2019-01-01 [en_US]
dc.identifier.isbn: 978-1-5386-9552-4 [en_US]
dc.identifier.issn: 1945-7871 [en_US]
dc.identifier.uri: http://dx.doi.org/10.1109/ICME.2019.00171 [en_US]
dc.identifier.uri: http://hdl.handle.net/11536/153329 [-]
dc.description.abstract: Advanced machine learning and deep learning techniques have steadily improved the accuracy of image classification. Most existing studies have investigated the data imbalance problem among classes to further enhance classification accuracy, but less attention has been paid to data imbalance within a single class. In this work, we present AC-GAN (Actor-Critic Generative Adversarial Network), a data augmentation framework that explicitly considers the heterogeneity of intra-class data. AC-GAN exploits a novel loss function to weigh the impact that different subclasses of data within a class have on GAN training. It can therefore effectively generate fake data for both majority and minority subclasses, which helps train a more accurate classifier. We use defect detection as an example application to evaluate our design. The results demonstrate that the intra-class distribution of the fake data generated by our AC-GAN can be more similar to that of the raw data. By balancing training across subclasses, AC-GAN enhances classification accuracy whether the intra-class data are uniformly or non-uniformly distributed. [en_US]
dc.language.iso: en_US [en_US]
dc.title: RESOLVING INTRA-CLASS IMBALANCE FOR GAN-BASED IMAGE AUGMENTATION [en_US]
dc.type: Proceedings Paper [en_US]
dc.identifier.doi: 10.1109/ICME.2019.00171 [en_US]
dc.identifier.journal: 2019 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME) [en_US]
dc.citation.spage: 970 [en_US]
dc.citation.epage: 975 [en_US]
dc.contributor.department: 資訊工程學系 (Department of Computer Science) [zh_TW]
dc.contributor.department: Department of Computer Science [en_US]
dc.identifier.wosnumber: WOS:000501820600163 [en_US]
dc.citation.woscount: 0 [en_US]
Appears in Collections: Conferences Paper