Semi-supervised sparse coding

Handle URI:
http://hdl.handle.net/10754/556650
Title:
Semi-supervised sparse coding
Authors:
Wang, Jim Jing-Yan; Gao, Xin (ORCID: 0000-0002-7108-3574)
Abstract:
Sparse coding approximates a data sample as a sparse linear combination of basic codewords and uses the sparse codes as new representations. In this paper, we investigate learning discriminative sparse codes in a semi-supervised manner, where only a few training samples are labeled. Using the manifold structure spanned by the data set of both labeled and unlabeled samples, together with the constraints provided by the labels of the labeled samples, we learn variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels can be predicted directly from the sparse codes using a linear classifier. By solving for the codebook, sparse codes, class labels, and classifier parameters simultaneously within a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed method over supervised sparse coding methods on partially labeled data sets.
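To make the core idea concrete, the sketch below shows generic sparse coding with a fixed codebook: a sample x is approximated as D @ s with a sparse code s found by ISTA (iterative soft-thresholding). This is an illustrative NumPy sketch only, not the paper's semi-supervised algorithm; the codebook D, the regularization weight lam, and all data here are made up for demonstration.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_code(D, x, lam=0.05, n_iter=200):
    """Sparse code for sample x over codebook D via ISTA, minimizing
    0.5 * ||x - D s||^2 + lam * ||s||_1."""
    L = np.linalg.norm(D, ord=2) ** 2  # Lipschitz constant of the gradient
    s = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ s - x)       # gradient of the reconstruction term
        s = soft_threshold(s - grad / L, lam / L)
    return s

# Synthetic demo: a sample generated from 3 of 50 codewords.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)          # unit-norm codewords
s_true = np.zeros(50)
s_true[[3, 17, 41]] = [1.5, -2.0, 1.0]  # genuinely sparse signal
x = D @ s_true
s = sparse_code(D, x)
print("active codewords:", np.count_nonzero(np.abs(s) > 0.1))
```

The paper's method goes beyond this step by also learning the codebook, propagating labels over the data manifold, and fitting a linear classifier on the codes, all within one objective.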
KAUST Department:
Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
Publisher:
Institute of Electrical & Electronics Engineers (IEEE)
Journal:
2014 International Joint Conference on Neural Networks (IJCNN)
Conference/Event name:
2014 International Joint Conference on Neural Networks, IJCNN 2014
Issue Date:
6-Jul-2014
DOI:
10.1109/IJCNN.2014.6889449
ARXIV:
arXiv:1311.6834
Type:
Conference Paper
Additional Links:
http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=6889449; http://arxiv.org/abs/1311.6834
Appears in Collections:
Conference Papers; Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division

Full metadata record

DC Field: Value (Language)
dc.contributor.author: Wang, Jim Jing-Yan (en)
dc.contributor.author: Gao, Xin (en)
dc.date.accessioned: 2015-06-10T11:40:41Z (en)
dc.date.available: 2015-06-10T11:40:41Z (en)
dc.date.issued: 2014-07-06 (en)
dc.identifier.doi: 10.1109/IJCNN.2014.6889449 (en)
dc.identifier.uri: http://hdl.handle.net/10754/556650 (en)
dc.description.abstract: Sparse coding approximates a data sample as a sparse linear combination of basic codewords and uses the sparse codes as new representations. In this paper, we investigate learning discriminative sparse codes in a semi-supervised manner, where only a few training samples are labeled. Using the manifold structure spanned by the data set of both labeled and unlabeled samples, together with the constraints provided by the labels of the labeled samples, we learn variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels can be predicted directly from the sparse codes using a linear classifier. By solving for the codebook, sparse codes, class labels, and classifier parameters simultaneously within a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed method over supervised sparse coding methods on partially labeled data sets. (en)
dc.publisher: Institute of Electrical & Electronics Engineers (IEEE) (en)
dc.relation.url: http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=6889449 (en)
dc.relation.url: http://arxiv.org/abs/1311.6834 (en)
dc.rights: (c) 2014 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works. (en)
dc.title: Semi-supervised sparse coding (en)
dc.type: Conference Paper (en)
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division (en)
dc.identifier.journal: 2014 International Joint Conference on Neural Networks (IJCNN) (en)
dc.conference.date: 2014-07-06 to 2014-07-11 (en)
dc.conference.name: 2014 International Joint Conference on Neural Networks, IJCNN 2014 (en)
dc.conference.location: Beijing, CHN (en)
dc.eprint.version: Post-print (en)
dc.contributor.institution: University at Buffalo, The State University of New York, Buffalo, NY 14203, USA (en)
dc.contributor.institution: Provincial Key Laboratory for Computer Information Processing Technology, Soochow University, Suzhou, 215006, China (en)
dc.identifier.arxivid: arXiv:1311.6834 (en)
kaust.author: Gao, Xin (en)
All Items in KAUST are protected by copyright, with all rights reserved, unless otherwise indicated.