Individuality- and Commonality-Based Multiview Multilabel Learning
Type
Article
KAUST Department
Computer Science Program
Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
Date
2019-11-19
Online Publication Date
2019-11-19
Print Publication Date
2020
Permanent link to this record
http://hdl.handle.net/10754/660405
Abstract
In multiview multilabel learning, each object is represented by several heterogeneous feature representations and is annotated with a set of discrete, nonexclusive labels. Previous studies typically focus on capturing the latent patterns shared among multiple views while not sufficiently considering the diverse characteristics of individual views, which can cause performance degradation. In this article, we propose a novel approach, individuality- and commonality-based multiview multilabel learning (ICM2L), to explicitly explore the individuality and commonality information of multilabel multiview data in a unified model. Specifically, a common subspace is learned across different views to capture the shared patterns. Then, multiple individual classifiers are exploited to explore the characteristics of individual views. Next, an ensemble strategy is adopted to make a prediction. Finally, we develop an alternating solution to jointly optimize our model, which enhances the robustness of the model toward rare labels, reinforces the reciprocal effects of individuality and commonality among heterogeneous views, and thus further improves performance. Experiments on various real-world datasets validate the effectiveness of ICM2L against state-of-the-art solutions; ICM2L can leverage individuality and commonality information to achieve improved performance and to enhance robustness toward rare labels.
Citation
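The pipeline the abstract outlines — a common subspace shared across views, per-view individual classifiers, and an ensemble combining both — can be sketched as follows. This is a minimal illustration with toy data; the alternating least-squares subspace, the ridge classifiers, the averaging ensemble, and all variable names are simplifying assumptions for exposition, not the authors' actual ICM2L formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multiview multilabel data: 2 views, 5 labels (illustrative only).
n, d1, d2, k, q = 100, 20, 15, 8, 5  # samples, view dims, subspace dim, labels
X = [rng.normal(size=(n, d1)), rng.normal(size=(n, d2))]
Y = (rng.random((n, q)) < 0.3).astype(float)  # binary label matrix

# 1) Commonality: learn a shared latent representation H such that each
#    view is approximately reconstructed as X_v ≈ H @ W_v, via a few
#    alternating least-squares updates.
H = rng.normal(size=(n, k))
for _ in range(20):
    W = [np.linalg.lstsq(H, Xv, rcond=None)[0] for Xv in X]  # per-view maps
    W_cat = np.hstack(W)   # (k, d1+d2): stack view maps
    X_cat = np.hstack(X)   # (n, d1+d2): stack views
    # Update H against the stacked reconstruction problem W_cat.T @ H.T ≈ X_cat.T.
    H = np.linalg.lstsq(W_cat.T, X_cat.T, rcond=None)[0].T

# 2) Individuality: one simple ridge classifier per view on its raw features,
#    plus one on the shared subspace.
def ridge(A, B, lam=1.0):
    """Solve (A^T A + lam I) M = A^T B for the weight matrix M."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ B)

view_models = [ridge(Xv, Y) for Xv in X]
common_model = ridge(H, Y)

# 3) Ensemble: average the commonality and individuality label scores.
scores = (H @ common_model
          + sum(Xv @ M for Xv, M in zip(X, view_models))) / (1 + len(X))
pred = (scores > 0.5).astype(int)
```

The averaging in step 3 is the simplest possible ensemble; a learned weighting per view would be a natural refinement in the same spirit.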
Tan, Q., Yu, G., Wang, J., Domeniconi, C., & Zhang, X. (2019). Individuality- and Commonality-Based Multiview Multilabel Learning. IEEE Transactions on Cybernetics, 1–12. doi:10.1109/tcyb.2019.2950560
Sponsors
The authors would like to thank those who kindly shared their source code and datasets for the experiments.
Journal
IEEE Transactions on Cybernetics
Additional Links
https://ieeexplore.ieee.org/document/8906215/
DOI
10.1109/tcyb.2019.2950560