Title: Tensor-Factorized Neural Networks
Authors: Chien, Jen-Tzung
Bao, Yi-Ting
Department: Department of Electrical and Computer Engineering
Keywords: Neural network (NN); pattern classification; tensor factorization (TF); tensor-factorized error backpropagation
Date of Publication: 1-May-2018
Abstract: Growing interest in multiway data analysis and deep learning has made tensor factorization (TF) and neural networks (NNs) crucial topics. Conventionally, an NN model is estimated from a set of one-way observations. Such a vectorized NN does not generalize to learning representations from multiway observations: its classification performance is constrained because the temporal or spatial information in neighboring ways is disregarded, and more parameters are required to learn complicated data structure. This paper presents a new tensor-factorized NN (TFNN), which tightly integrates TF and NN for multiway feature extraction and classification under a unified discriminative objective. The TFNN can be seen as a generalized NN in which the affine transformation of an NN is replaced by a multilinear, multiway factorization. Multiway information is preserved through layerwise factorization: Tucker decomposition and nonlinear activation are performed in each hidden layer. A tensor-factorized error backpropagation is developed to train the TFNN with limited parameter size and computation time. The TFNN can be further extended to a convolutional TFNN (CTFNN) by looking at small subtensors through factorized convolution. Experiments on real-world classification tasks demonstrate that the TFNN and CTFNN attain substantial improvements over an NN and a convolutional NN, respectively.
URI: http://dx.doi.org/10.1109/TNNLS.2017.2690379
http://hdl.handle.net/11536/144901
ISSN: 2162-237X
DOI: 10.1109/TNNLS.2017.2690379
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Volume: 29
Start Page: 1998
End Page: 2011
Appears in Collections: Journal Articles
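The abstract's core idea, replacing a hidden layer's affine transformation with a multilinear (Tucker-style) mapping followed by a nonlinear activation, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the class name `TuckerLayer`, the mode-`n` product helper, the tanh activation, the shapes, and the initialization are all illustrative assumptions chosen to show how one small factor matrix per way transforms a multiway input while preserving its tensor structure.

```python
import numpy as np

def mode_n_product(T, M, n):
    """Mode-n product T x_n M: multiply matrix M along the n-th way of tensor T."""
    T = np.moveaxis(T, n, 0)                      # bring mode n to the front
    front, rest = T.shape[0], T.shape[1:]
    out = M @ T.reshape(front, -1)                # matrix times unfolded tensor
    return np.moveaxis(out.reshape((M.shape[0],) + rest), 0, n)

class TuckerLayer:
    """Illustrative hidden layer: instead of a single large affine map on a
    vectorized input, apply one small factor matrix per way (a multilinear,
    Tucker-style transformation) and then a nonlinear activation."""

    def __init__(self, in_shape, out_shape, rng=None):
        rng = rng or np.random.default_rng(0)
        # One factor matrix U_n of size (out_n, in_n) per way -- far fewer
        # parameters than a dense map on the flattened input.
        self.factors = [rng.standard_normal((o, i)) * 0.1
                        for i, o in zip(in_shape, out_shape)]
        self.bias = np.zeros(out_shape)

    def forward(self, X):
        """Forward pass for a single multiway sample X of shape in_shape."""
        for n, U in enumerate(self.factors):
            X = mode_n_product(X, U, n)           # layerwise factorization
        return np.tanh(X + self.bias)             # nonlinear activation

layer = TuckerLayer(in_shape=(4, 5), out_shape=(3, 2))
hidden = layer.forward(np.ones((4, 5)))           # hidden.shape == (3, 2)
```

The sketch processes one two-way sample; the parameter count here is 3·4 + 2·5 = 22 versus 6·20 = 120 for a dense map on the flattened input, which reflects the parameter saving the abstract attributes to layerwise factorization.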