Type
Conference Paper
Authors
Wang, Jim Jing-Yan; Gao, Xin

KAUST Department
Computational Bioscience Research Center (CBRC)
Computer Science Program
Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
Date
2014-09-10
Preprint Posting Date
2013-11-26
Online Publication Date
2014-09-10
Print Publication Date
2014-07
Permanent link to this record
http://hdl.handle.net/10754/556650
Abstract
Sparse coding approximates a data sample as a sparse linear combination of basic codewords and uses the sparse codes as new representations. In this paper, we investigate learning discriminative sparse codes in a semi-supervised manner, where only a few training samples are labeled. Using the manifold structure spanned by the data set of both labeled and unlabeled samples, together with the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels can be predicted directly from the sparse codes by a linear classifier. By solving for the codebook, sparse codes, class labels, and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed method over supervised sparse coding methods on partially labeled data sets.
Citation
Wang, J. J.-Y., & Gao, X. (2014). Semi-supervised sparse coding. 2014 International Joint Conference on Neural Networks (IJCNN). doi:10.1109/ijcnn.2014.6889449
Conference/Event name
2014 International Joint Conference on Neural Networks, IJCNN 2014
arXiv
1311.6834
Additional Links
http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=6889449
http://arxiv.org/abs/1311.6834
DOI
10.1109/IJCNN.2014.6889449
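
The abstract describes a single objective coupling reconstruction, sparsity, manifold-regularized label propagation, and a linear classifier. A plausible sketch of such an objective is given below; the symbols ($X$, $D$, $S$, $Y$, $W$, $L$) and the specific form of each regularizer are illustrative assumptions, not notation taken from the paper itself:

```latex
% Hypothetical sketch of a unified semi-supervised sparse coding objective.
% X: data matrix; D: codebook; S: sparse codes (columns s_i);
% Y: class-label matrix for all samples (labeled and unlabeled);
% W: linear classifier; L: graph Laplacian over labeled + unlabeled samples.
\min_{D,\,S,\,Y,\,W}\;
    \|X - DS\|_F^2                                   % reconstruction error
  + \alpha \sum_i \|s_i\|_1                          % sparsity of the codes
  + \beta\, \operatorname{Tr}\!\left(Y L Y^\top\right) % manifold smoothness of labels
  + \gamma\, \|Y - WS\|_F^2                          % labels predictable from codes
\qquad \text{s.t.}\; y_i = \hat{y}_i \ \text{for each labeled sample } i .
```

Objectives of this shape are typically solved by alternating minimization over $D$, $S$, $Y$, and $W$, which is consistent with the abstract's statement that all four quantities are solved simultaneously in one unified objective function.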