OmniGA: Optimized Omnivariate Decision Trees for Generalizable Classification Models

Abstract
Classification problems from different domains vary in complexity, size, and imbalance of the number of samples from different classes. Although many classification models have been proposed, selecting the right model and its parameters for a given classification task to achieve good performance is not trivial. Therefore, there is a constant interest in developing novel, robust, and efficient models suitable for a great variety of data. Here, we propose OmniGA, a framework for the optimization of omnivariate decision trees based on a parallel genetic algorithm, coupled with a deep-learning structure and ensemble learning methods. The performance of the OmniGA framework was evaluated on 12 different datasets, taken mainly from biomedical problems, and compared with the results obtained by several robust and commonly used machine-learning models with optimized parameters. The results show that OmniGA systematically outperformed these models on all the considered datasets, reducing the F-score error by between 2.25% and 100% compared with the best-performing model. This demonstrates that OmniGA produces robust models with improved performance. The OmniGA code and datasets are available at www.cbrc.kaust.edu.sa/omniga/.
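
As a rough illustration of the core idea behind an omnivariate tree (not the authors' implementation), a node may choose between a simple univariate threshold split and a multivariate linear split whose parameters are tuned by a genetic algorithm, keeping whichever separates the node's data better. The Python sketch below makes several simplifying assumptions: binary labels, Gini impurity as the split criterion, and a toy GA with truncation selection, uniform crossover, and Gaussian mutation; all function and variable names are illustrative.

import numpy as np

rng = np.random.default_rng(0)

def gini(y):
    # Gini impurity of an integer label vector
    if len(y) == 0:
        return 0.0
    p = np.bincount(y, minlength=2) / len(y)
    return 1.0 - np.sum(p ** 2)

def split_impurity(X, y, w, b):
    # Weighted Gini impurity after splitting on the hyperplane w.x + b <= 0
    mask = X @ w + b <= 0
    return (mask.sum() * gini(y[mask]) + (~mask).sum() * gini(y[~mask])) / len(y)

def ga_linear_split(X, y, pop_size=40, generations=60, mutation=0.1):
    # Toy genetic algorithm: evolve hyperplane parameters [w, b] that minimize
    # the impurity of the resulting multivariate split
    d = X.shape[1]
    pop = rng.normal(size=(pop_size, d + 1))
    for _ in range(generations):
        fitness = np.array([split_impurity(X, y, ind[:-1], ind[-1]) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]      # truncation selection
        idx = rng.integers(0, len(parents), size=(pop_size, 2))
        mix = rng.random((pop_size, d + 1)) < 0.5                # uniform crossover
        children = np.where(mix, parents[idx[:, 0]], parents[idx[:, 1]])
        children += mutation * rng.normal(size=children.shape)   # Gaussian mutation
        children[0] = parents[0]                                 # elitism: keep the best
        pop = children
    best = min(pop, key=lambda ind: split_impurity(X, y, ind[:-1], ind[-1]))
    return best[:-1], best[-1]

# Toy data: two overlapping Gaussian blobs (binary classification)
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)), rng.normal(2.0, 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# Best univariate (axis-aligned) split found by exhaustive threshold search
uni = min(split_impurity(X, y, np.eye(2)[j], -t)
          for j in range(2) for t in np.unique(X[:, j]))

# Multivariate split found by the genetic algorithm
w, b = ga_linear_split(X, y)
multi = split_impurity(X, y, w, b)

# An omnivariate node keeps whichever split type scores better on this node's data
print(f"univariate impurity: {uni:.3f}, GA multivariate impurity: {multi:.3f}")

In the actual framework such node-level choices are made throughout the tree and the whole structure is optimized by a parallel genetic algorithm, but the sketch above conveys the basic trade-off between axis-aligned and oblique splits.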

Citation
Magana-Mora A, Bajic VB (2017) OmniGA: Optimized Omnivariate Decision Trees for Generalizable Classification Models. Scientific Reports 7. Available: http://dx.doi.org/10.1038/s41598-017-04281-9.

Acknowledgements
This work was supported by King Abdullah University of Science and Technology (KAUST) through the baseline fund BAS/1/1606-01-01 of V.B.B. This research made use of the resources of the Computational Bioscience Research Center (CBRC) at KAUST, Thuwal, Saudi Arabia.

Publisher
Springer Nature

Journal
Scientific Reports

DOI
10.1038/s41598-017-04281-9

PubMed ID
28634344

Additional Links
https://www.nature.com/articles/s41598-017-04281-9
