Show simple item record

dc.contributor.author: Ou, Guangjin
dc.contributor.author: Yu, Guoxian
dc.contributor.author: Domeniconi, Carlotta
dc.contributor.author: Lu, Xuequan
dc.contributor.author: Zhang, Xiangliang
dc.date.accessioned: 2020-09-30T12:54:36Z
dc.date.available: 2020-09-30T12:54:36Z
dc.date.issued: 2020-09-21
dc.date.submitted: 2020-06-19
dc.identifier.citation: Ou, G., Yu, G., Domeniconi, C., Lu, X., & Zhang, X. (2020). Multi-label zero-shot learning with graph convolutional networks. Neural Networks, 132, 333–341. doi:10.1016/j.neunet.2020.09.010
dc.identifier.issn: 0893-6080
dc.identifier.pmid: 32977278
dc.identifier.doi: 10.1016/j.neunet.2020.09.010
dc.identifier.uri: http://hdl.handle.net/10754/665385
dc.description.abstract: The goal of zero-shot learning (ZSL) is to build a classifier that recognizes novel categories with no corresponding annotated training data. The typical routine is to transfer knowledge from seen classes to unseen ones by learning a visual-semantic embedding. Existing multi-label zero-shot learning approaches either ignore correlations among labels, suffer from large label combinations, or learn the embedding using only local or global visual features. In this paper, we propose a Graph Convolutional Networks based Multi-label Zero-Shot Learning model, abbreviated as MZSL-GCN. Our model first constructs a label relation graph using label co-occurrences and compensates for the absence of unseen labels in the training phase by semantic similarity. It then takes the graph and the word embedding of each seen (unseen) label as inputs to the GCN to learn the label semantic embedding, and to obtain a set of inter-dependent object classifiers. MZSL-GCN simultaneously trains another attention network to learn compatible local and global visual features of objects with respect to the classifiers, and thus makes the whole network end-to-end trainable. In addition, the use of unlabeled training data can reduce the bias toward seen labels and boost the generalization ability. Experimental results on benchmark datasets show that our MZSL-GCN competes with state-of-the-art approaches.
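The abstract describes propagating label word embeddings over a co-occurrence graph with a GCN to produce inter-dependent classifiers. The sketch below illustrates only the standard GCN propagation rule (H' = ReLU(D^{-1/2}(A+I)D^{-1/2} H W)) on a toy label graph; the graph, embeddings, weights, and all names are illustrative assumptions, not the paper's actual implementation or data.

```python
# Illustrative sketch (not the paper's code): one GCN layer over a label
# co-occurrence graph, mapping label word embeddings to classifier weights.
import math

def gcn_layer(adj, H, W):
    """One GCN layer: ReLU(D^{-1/2} (A + I) D^{-1/2} H W).

    adj: n x n label adjacency (e.g., co-occurrence counts), no self-loops.
    H:   n x d input label embeddings (e.g., word vectors).
    W:   d x k weight matrix (learned in practice; fixed here for the toy).
    Returns an n x k matrix whose rows act as per-label classifiers.
    """
    n = len(adj)
    # Add self-loops: A_hat = A + I
    A = [[adj[i][j] + (1.0 if i == j else 0.0) for j in range(n)] for i in range(n)]
    # Symmetric normalization: D^{-1/2} A_hat D^{-1/2}
    deg = [sum(row) for row in A]
    A_norm = [[A[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(n)] for i in range(n)]

    def matmul(X, Y):
        return [[sum(X[i][t] * Y[t][j] for t in range(len(Y)))
                 for j in range(len(Y[0]))] for i in range(len(X))]

    # Propagate and apply ReLU
    Z = matmul(matmul(A_norm, H), W)
    return [[max(0.0, v) for v in row] for row in Z]

# Toy example: 3 labels with 2-dim embeddings; labels 0-1 co-occur often,
# 1-2 rarely, 0-2 never, so label 1 bridges information between 0 and 2.
adj = [[0, 2, 0],
       [2, 0, 1],
       [0, 1, 0]]
H = [[1.0, 0.0],
     [0.0, 1.0],
     [0.5, 0.5]]   # made-up word embeddings, one row per label
W = [[1.0, -1.0],
     [0.5,  0.5]]  # made-up layer weights
classifiers = gcn_layer(adj, H, W)
```

Because neighbors are mixed in before the linear map, each row of `classifiers` reflects not only that label's own embedding but also those of its co-occurring labels, which is how unseen labels (connected via semantic similarity) can inherit classifier information from seen ones.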
dc.description.sponsorship: This work was supported by the National Natural Science Foundation of China (62031003, 61872300 and 62072380).
dc.publisher: Elsevier BV
dc.relation.url: https://linkinghub.elsevier.com/retrieve/pii/S0893608020303336
dc.rights: NOTICE: this is the author's version of a work that was accepted for publication in Neural Networks. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Neural Networks, 132 (2020-09-21), DOI: 10.1016/j.neunet.2020.09.010. © 2020. This manuscript version is made available under the CC-BY-NC-ND 4.0 license: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.title: Multi-label zero-shot learning with graph convolutional networks
dc.type: Article
dc.contributor.department: Computer Science Program
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
dc.contributor.department: Machine Intelligence & kNowledge Engineering Lab
dc.identifier.journal: Neural Networks
dc.rights.embargodate: 2022-09-22
dc.eprint.version: Post-print
dc.contributor.institution: School of Software, Shandong University, Jinan, China
dc.contributor.institution: College of Computer and Information Sciences, Southwest University, Chongqing, China
dc.contributor.institution: Department of Computer Science, George Mason University, Fairfax, VA, USA
dc.contributor.institution: School of Information Technology, Deakin University, Australia
dc.identifier.volume: 132
dc.identifier.pages: 333-341
kaust.person: Yu, Guoxian
kaust.person: Zhang, Xiangliang
dc.date.accepted: 2020-09-11
dc.identifier.eid: 2-s2.0-85091232142
refterms.dateFOA: 2020-10-01T05:34:10Z
dc.date.published-online: 2020-09-21
dc.date.published-print: 2020-12


Files in this item

Name: MZSL_GCN.pdf
Size: 578.9 KB
Format: PDF
Description: Accepted manuscript
Embargo End Date: 2022-09-22

This item appears in the following Collection(s)
