OmniGA: Optimized Omnivariate Decision Trees for Generalizable Classification Models

Handle URI:
http://hdl.handle.net/10754/625185
Title:
OmniGA: Optimized Omnivariate Decision Trees for Generalizable Classification Models
Authors:
Magana-Mora, Arturo (0000-0001-8696-7068); Bajic, Vladimir B. (0000-0001-5435-4750)
Abstract:
Classification problems from different domains vary in complexity, size, and imbalance of the number of samples from different classes. Although several classification models have been proposed, selecting the right model and parameters for a given classification task to achieve good performance is not trivial. Therefore, there is a constant interest in developing novel robust and efficient models suitable for a great variety of data. Here, we propose OmniGA, a framework for the optimization of omnivariate decision trees based on a parallel genetic algorithm, coupled with deep learning structure and ensemble learning methods. The performance of the OmniGA framework is evaluated on 12 different datasets taken mainly from biomedical problems and compared with the results obtained by several robust and commonly used machine-learning models with optimized parameters. The results show that OmniGA systematically outperformed these models for all the considered datasets, reducing the F score error in the range from 100% to 2.25%, compared to the best performing model. This demonstrates that OmniGA produces robust models with improved performance. OmniGA code and datasets are available at www.cbrc.kaust.edu.sa/omniga/.
KAUST Department:
Computational Bioscience Research Center (CBRC)
Citation:
Magana-Mora A, Bajic VB (2017) OmniGA: Optimized Omnivariate Decision Trees for Generalizable Classification Models. Scientific Reports 7. Available: http://dx.doi.org/10.1038/s41598-017-04281-9.
Publisher:
Springer Nature
Journal:
Scientific Reports
KAUST Grant Number:
BAS/1/1606-01-01
Issue Date:
14-Jun-2017
DOI:
10.1038/s41598-017-04281-9
Type:
Article
ISSN:
2045-2322
Sponsors:
This work was supported by King Abdullah University of Science and Technology (KAUST) through the baseline fund BAS/1/1606-01-01 of V.B.B. This research made use of the resources of CBRC at KAUST, Thuwal, Saudi Arabia.
Additional Links:
https://www.nature.com/articles/s41598-017-04281-9
Appears in Collections:
Articles; Computational Bioscience Research Center (CBRC)
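
Illustrative note:
The abstract above describes OmniGA only at a high level: omnivariate decision trees whose node models and parameters are optimized by a parallel genetic algorithm, combined with a deep learning structure and ensemble learning. As a rough, hedged illustration of the "omnivariate" idea only, and not the authors' implementation (their code and datasets are at www.cbrc.kaust.edu.sa/omniga/), the Python sketch below lets a univariate threshold split compete against linear and nonlinear multivariate models at a single tree node. It uses plain cross-validation in place of OmniGA's parallel genetic algorithm, assumes scikit-learn is available, and all function names and parameters (e.g., best_omnivariate_split) are illustrative assumptions rather than part of OmniGA.

    # Minimal sketch of an omnivariate node, assuming scikit-learn.
    # Not the authors' OmniGA code: cross-validated selection stands in
    # for the parallel genetic algorithm described in the paper.
    from sklearn.linear_model import LogisticRegression
    from sklearn.neural_network import MLPClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    def best_omnivariate_split(X, y, random_state=0):
        """Return (name, fitted_model) of the node model with the best CV F1 score."""
        candidates = {
            # univariate: an axis-aligned threshold on a single feature (depth-1 tree)
            "univariate": DecisionTreeClassifier(max_depth=1, random_state=random_state),
            # multivariate linear: a hyperplane over all features
            "linear": LogisticRegression(max_iter=1000),
            # multivariate nonlinear: a small neural network
            "nonlinear": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                       random_state=random_state),
        }
        scores = {
            name: cross_val_score(model, X, y, cv=3, scoring="f1").mean()
            for name, model in candidates.items()
        }
        name = max(scores, key=scores.get)
        return name, candidates[name].fit(X, y)

    if __name__ == "__main__":
        # Toy usage on synthetic binary data; real datasets are linked above.
        from sklearn.datasets import make_classification
        X, y = make_classification(n_samples=300, n_features=10, random_state=0)
        name, model = best_omnivariate_split(X, y)
        print("best node model:", name)

In OmniGA itself, the choice of node model and its parameters is evolved by the genetic algorithm rather than selected by cross-validation as in this sketch.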

Full metadata record

DC Field | Value | Language
dc.contributor.author | Magana-Mora, Arturo | en
dc.contributor.author | Bajic, Vladimir B. | en
dc.date.accessioned | 2017-07-12T07:20:55Z | -
dc.date.available | 2017-07-12T07:20:55Z | -
dc.date.issued | 2017-06-14 | en
dc.identifier.citation | Magana-Mora A, Bajic VB (2017) OmniGA: Optimized Omnivariate Decision Trees for Generalizable Classification Models. Scientific Reports 7. Available: http://dx.doi.org/10.1038/s41598-017-04281-9. | en
dc.identifier.issn | 2045-2322 | en
dc.identifier.doi | 10.1038/s41598-017-04281-9 | en
dc.identifier.uri | http://hdl.handle.net/10754/625185 | -
dc.description.abstract | Classification problems from different domains vary in complexity, size, and imbalance of the number of samples from different classes. Although several classification models have been proposed, selecting the right model and parameters for a given classification task to achieve good performance is not trivial. Therefore, there is a constant interest in developing novel robust and efficient models suitable for a great variety of data. Here, we propose OmniGA, a framework for the optimization of omnivariate decision trees based on a parallel genetic algorithm, coupled with deep learning structure and ensemble learning methods. The performance of the OmniGA framework is evaluated on 12 different datasets taken mainly from biomedical problems and compared with the results obtained by several robust and commonly used machine-learning models with optimized parameters. The results show that OmniGA systematically outperformed these models for all the considered datasets, reducing the F score error in the range from 100% to 2.25%, compared to the best performing model. This demonstrates that OmniGA produces robust models with improved performance. OmniGA code and datasets are available at www.cbrc.kaust.edu.sa/omniga/. | en
dc.description.sponsorship | This work was supported by King Abdullah University of Science and Technology (KAUST) through the baseline fund BAS/1/1606-01-01 of V.B.B. This research made use of the resources of CBRC at KAUST, Thuwal, Saudi Arabia. | en
dc.publisher | Springer Nature | en
dc.relation.url | https://www.nature.com/articles/s41598-017-04281-9 | en
dc.rights | This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/. | en
dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | en
dc.title | OmniGA: Optimized Omnivariate Decision Trees for Generalizable Classification Models | en
dc.type | Article | en
dc.contributor.department | Computational Bioscience Research Center (CBRC) | en
dc.identifier.journal | Scientific Reports | en
dc.eprint.version | Publisher's Version/PDF | en
kaust.author | Magana-Mora, Arturo | en
kaust.author | Bajic, Vladimir B. | en
kaust.grant.number | BAS/1/1606-01-01 | en