Feature selection and multi-kernel learning for adaptive graph regularized nonnegative matrix factorization

Handle URI:
http://hdl.handle.net/10754/552351
Title:
Feature selection and multi-kernel learning for adaptive graph regularized nonnegative matrix factorization
Authors:
Wang, Jim Jing-Yan; Huang, Jianhua Z.; Sun, Yijun; Gao, Xin (0000-0002-7108-3574)
Abstract:
Nonnegative matrix factorization (NMF), a popular part-based representation technique, does not capture the intrinsic local geometric structure of the data space. Graph regularized NMF (GNMF) was recently proposed to avoid this limitation by regularizing NMF with a nearest neighbor graph constructed from the input data set. However, GNMF has two main bottlenecks. First, using the original feature space directly to construct the graph is not necessarily optimal because of the noisy and irrelevant features and nonlinear distributions of data samples. Second, one possible way to handle the nonlinear distribution of data samples is by kernel embedding. However, it is often difficult to choose the most suitable kernel. To solve these bottlenecks, we propose two novel graph-regularized NMF methods, AGNMFFS and AGNMFMK, by introducing feature selection and multiple-kernel learning to the graph regularized NMF, respectively. Instead of using a fixed graph as in GNMF, the two proposed methods learn the nearest neighbor graph that is adaptive to the selected features and learned multiple kernels, respectively. For each method, we propose a unified objective function to conduct feature selection/multi-kernel learning, NMF and adaptive graph regularization simultaneously. We further develop two iterative algorithms to solve the two optimization problems. Experimental results on two challenging pattern classification tasks demonstrate that the proposed methods significantly outperform state-of-the-art data representation methods.
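For readers unfamiliar with the GNMF baseline that the proposed AGNMF-FS and AGNMF-MK methods extend, the following is a minimal NumPy sketch of standard graph-regularized NMF with multiplicative updates and a heat-kernel k-nearest-neighbor graph. It does not implement the paper's adaptive graph learning, feature selection, or multi-kernel steps; all function names and parameter values (k, lam, rank r) are illustrative assumptions, not the authors' code.

```python
import numpy as np

def knn_heat_graph(Xs, k=5, sigma=1.0):
    """Heat-kernel weighted k-NN graph over the rows of Xs (n samples x m features)."""
    n = Xs.shape[0]
    d2 = np.sum((Xs[:, None, :] - Xs[None, :, :]) ** 2, axis=2)  # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        nn = np.argsort(d2[i])[1:k + 1]                 # k nearest neighbors, skipping the sample itself
        W[i, nn] = np.exp(-d2[i, nn] / (2.0 * sigma ** 2))
    return np.maximum(W, W.T)                           # symmetrize

def gnmf(X, r=10, k=5, lam=1.0, n_iter=200, eps=1e-9, seed=0):
    """Standard multiplicative updates for graph-regularized NMF:
       min_{U,V >= 0} ||X - U V^T||_F^2 + lam * tr(V^T L V),
       where X is (m features x n samples), U is (m x r), V is (n x r),
       and L = D - W is the Laplacian of a fixed k-NN graph over the samples
       (in AGNMF-FS / AGNMF-MK this graph would instead be learned adaptively)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, r))
    V = rng.random((n, r))
    W = knn_heat_graph(X.T, k=k)
    D = np.diag(W.sum(axis=1))
    for _ in range(n_iter):
        U *= (X @ V) / (U @ (V.T @ V) + eps)
        V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
    return U, V

if __name__ == "__main__":
    X = np.abs(np.random.default_rng(1).random((50, 100)))  # toy nonnegative data matrix
    U, V = gnmf(X, r=5)
    print("reconstruction error:", np.linalg.norm(X - U @ V.T))
```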
KAUST Department:
Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
Citation:
Feature selection and multi-kernel learning for adaptive graph regularized nonnegative matrix factorization. Expert Systems with Applications, 2015, 42(3): 1278.
Publisher:
Elsevier BV
Journal:
Expert Systems with Applications
Issue Date:
20-Sep-2014
DOI:
10.1016/j.eswa.2014.09.008
Type:
Article
ISSN:
0957-4174
Additional Links:
http://linkinghub.elsevier.com/retrieve/pii/S0957417414005478
Appears in Collections:
Articles; Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division

Full metadata record

DC Field: Value (Language)
dc.contributor.author: Wang, Jim Jing-Yan (en)
dc.contributor.author: Huang, Jianhua Z. (en)
dc.contributor.author: Sun, Yijun (en)
dc.contributor.author: Gao, Xin (en)
dc.date.accessioned: 2015-05-06T13:16:16Z (en)
dc.date.available: 2015-05-06T13:16:16Z (en)
dc.date.issued: 2014-09-20 (en)
dc.identifier.citation: Feature selection and multi-kernel learning for adaptive graph regularized nonnegative matrix factorization. Expert Systems with Applications, 2015, 42(3): 1278. (en)
dc.identifier.issn: 0957-4174 (en)
dc.identifier.doi: 10.1016/j.eswa.2014.09.008 (en)
dc.identifier.uri: http://hdl.handle.net/10754/552351 (en)
dc.description.abstract: Nonnegative matrix factorization (NMF), a popular part-based representation technique, does not capture the intrinsic local geometric structure of the data space. Graph regularized NMF (GNMF) was recently proposed to avoid this limitation by regularizing NMF with a nearest neighbor graph constructed from the input data set. However, GNMF has two main bottlenecks. First, using the original feature space directly to construct the graph is not necessarily optimal because of the noisy and irrelevant features and nonlinear distributions of data samples. Second, one possible way to handle the nonlinear distribution of data samples is by kernel embedding. However, it is often difficult to choose the most suitable kernel. To solve these bottlenecks, we propose two novel graph-regularized NMF methods, AGNMFFS and AGNMFMK, by introducing feature selection and multiple-kernel learning to the graph regularized NMF, respectively. Instead of using a fixed graph as in GNMF, the two proposed methods learn the nearest neighbor graph that is adaptive to the selected features and learned multiple kernels, respectively. For each method, we propose a unified objective function to conduct feature selection/multi-kernel learning, NMF and adaptive graph regularization simultaneously. We further develop two iterative algorithms to solve the two optimization problems. Experimental results on two challenging pattern classification tasks demonstrate that the proposed methods significantly outperform state-of-the-art data representation methods. (en)
dc.publisher: Elsevier BV (en)
dc.relation.url: http://linkinghub.elsevier.com/retrieve/pii/S0957417414005478 (en)
dc.rights: Archived with thanks to Expert Systems with Applications. http://creativecommons.org/licenses/by-nc-nd/3.0/ (en)
dc.subject: Data representation (en)
dc.subject: Nonnegative matrix factorization (en)
dc.subject: Graph regularization (en)
dc.subject: Feature selection (en)
dc.subject: Multi-kernel learning (en)
dc.title: Feature selection and multi-kernel learning for adaptive graph regularized nonnegative matrix factorization (en)
dc.type: Article (en)
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division (en)
dc.identifier.journal: Expert Systems with Applications (en)
dc.eprint.version: Publisher's Version/PDF (en)
dc.contributor.institution: Department of Statistics, Texas A&M University, TX 77843-3143, USA (en)
dc.contributor.institution: University at Buffalo, The State University of New York, Buffalo, NY 14203, USA (en)
kaust.author: Wang, Jim Jing-Yan (en)
kaust.author: Gao, Xin (en)
All Items in KAUST are protected by copyright, with all rights reserved, unless otherwise indicated.