KAUST Department: Applied Mathematics and Computational Science Program
Computer, Electrical and Mathematical Science and Engineering (CEMSE) Division
Extensions of Dynamic Programming, Machine Learning and Discrete Optimization Research Group
Online Publication Date: 2021-09-16
Print Publication Date: 2021
Permanent link to this record: http://hdl.handle.net/10754/671321
Abstract: In this paper, we consider decision trees that use both conventional queries, each based on a single attribute, and queries based on hypotheses about the values of all attributes. Such decision trees are similar to those studied in exact learning, where membership and equivalence queries are allowed. We present dynamic programming algorithms for minimizing the depth of such decision trees and discuss the results of computer experiments on various data sets and on randomly generated Boolean functions.
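The dynamic programming approach mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a consistent decision table, and it models a hypothesis query in the spirit of the exact-learning analogy — the answer is either "yes" (the row matches the hypothesized tuple) or a counterexample naming one attribute and its actual value. All function and variable names here are ours.

```python
from functools import lru_cache
from itertools import product

def min_depth(rows):
    """Minimum depth of a decision tree for `rows` (list of
    (attribute_tuple, decision) pairs), allowing both attribute
    queries and hypothesis queries.  Sketch only; assumes a
    consistent table (equal tuples imply equal decisions)."""
    n = len(rows[0][0])                       # number of attributes
    values = [sorted({r[0][a] for r in rows}) for a in range(n)]

    @lru_cache(maxsize=None)
    def _solve(idx):
        if len({rows[i][1] for i in idx}) <= 1:
            return 0                          # one decision: no query needed
        best = float('inf')
        # Attribute query: branch on the value of a single attribute.
        for a in range(n):
            parts = {}
            for i in idx:
                parts.setdefault(rows[i][0][a], set()).add(i)
            if len(parts) > 1:
                best = min(best,
                           1 + max(_solve(frozenset(p)) for p in parts.values()))
        # Hypothesis query: propose a full tuple h; the worst-case answer
        # is either "yes" or a counterexample (attribute a, value v != h[a]).
        for h in product(*values):
            worst, ok = 0, True
            for a in range(n):
                for v in values[a]:
                    if v == h[a]:
                        continue
                    sub = frozenset(i for i in idx if rows[i][0][a] == v)
                    if sub == idx:            # answer gives no information
                        ok = False
                        break
                    if sub:
                        worst = max(worst, _solve(sub))
                if not ok:
                    break
            if ok:
                yes = frozenset(i for i in idx if rows[i][0] == h)
                if yes:
                    worst = max(worst, _solve(yes))
                best = min(best, 1 + worst)
        return best

    return _solve(frozenset(range(len(rows))))
```

For example, on a two-attribute XOR-style table no single query of either kind separates all decisions, so the minimum depth is 2; the memoization over subsets of rows is what makes the recursion a dynamic program.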
Citation: Azad, M., Chikalov, I., Hussain, S., & Moshkov, M. (2021). Minimizing Depth of Decision Trees with Hypotheses. Lecture Notes in Computer Science, 123–133. doi:10.1007/978-3-030-87334-9_11
Sponsors: Research reported in this publication was supported by King Abdullah University of Science and Technology (KAUST). The authors are greatly indebted to the anonymous reviewers for useful comments and suggestions.
Publisher: Springer International Publishing
Conference/Event name: International Joint Conference on Rough Sets 2021