    A greedy algorithm for construction of decision trees for tables with many-valued decisions - A comparative study

    Type
    Article
    Authors
    Azad, Mohammad
    Chikalov, Igor
    Moshkov, Mikhail
    Zielosko, Beata
    KAUST Department
    Applied Mathematics and Computational Science Program
    Computational Bioscience Research Center (CBRC)
    Computer Science Program
    Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
    Extensions of Dynamic Programming, Machine Learning and Discrete Optimization Research Group
    Office of the VP
    Date
    2013-11-25
    Permanent link to this record
    http://hdl.handle.net/10754/564819
    
    Abstract
    In this paper, we study a greedy algorithm for the construction of decision trees. This algorithm is applicable to decision tables with many-valued decisions, where each row is labeled with a set of decisions. For a given row, we should find a decision from the set attached to this row. Experimental results for data sets from the UCI Machine Learning Repository and randomly generated tables are presented. We make a comparative study of the depth and average depth of the constructed decision trees for the proposed approach and the approach based on generalized decisions. The obtained results show that the proposed approach can be useful from the point of view of knowledge representation and algorithm construction.
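    To make the kind of procedure described in the abstract concrete, the Python sketch below (not taken from the paper) builds a decision tree greedily for a toy decision table with many-valued decisions. The table representation, the stopping rule (stop as soon as some decision belongs to the decision set of every row in a subtable), and the splitting heuristic (choose the attribute whose partition minimizes the worst-branch uncertainty) are illustrative assumptions; the paper's exact greedy criterion and experimental setup are not reproduced here.

        from collections import Counter

        def common_decision(rows):
            # Each row is (attribute_values, decision_set).
            # Return a decision shared by every row's decision set, or None.
            if not rows:
                return None
            shared = set.intersection(*(set(ds) for _, ds in rows))
            return min(shared) if shared else None

        def uncertainty(rows):
            # Simple proxy for how far a subtable is from having a common decision:
            # number of rows minus the largest number of rows covered by one decision.
            counts = Counter(d for _, ds in rows for d in ds)
            return len(rows) - max(counts.values()) if counts else 0

        def split(rows, attr):
            # Partition the rows by their value of the given attribute.
            branches = {}
            for vals, ds in rows:
                branches.setdefault(vals[attr], []).append((vals, ds))
            return branches

        def build_tree(rows, attributes):
            # Greedy construction: stop when some decision is attached to every row
            # of the subtable; otherwise split on the attribute whose partition
            # minimizes the worst-branch uncertainty (an assumed heuristic).
            d = common_decision(rows)
            if d is not None or not attributes:
                return {"leaf": d}
            best = min(attributes,
                       key=lambda a: max(uncertainty(b) for b in split(rows, a).values()))
            rest = [a for a in attributes if a != best]
            return {"attr": best,
                    "children": {v: build_tree(b, rest)
                                 for v, b in split(rows, best).items()}}

        # Toy decision table with many-valued decisions: two binary attributes,
        # each row labeled with a set of admissible decisions.
        table = [((0, 0), {1}), ((0, 1), {1, 2}), ((1, 0), {2}), ((1, 1), {2, 3})]
        print(build_tree(table, [0, 1]))
        # -> {'attr': 0, 'children': {0: {'leaf': 1}, 1: {'leaf': 2}}}

    Under the approach based on generalized decisions, each row would instead be labeled with its whole set of decisions treated as a single new decision, which can only make the problem harder (any tree that identifies the generalized decision also yields a decision from each row's set); the paper compares the depth and average depth of trees built under both interpretations.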
    Publisher
    IOS Press
    Journal
    Fundamenta Informaticae
    DOI
    10.3233/FI-2013-929
    Collections
    Articles; Applied Mathematics and Computational Science Program; Computer Science Program; Computational Bioscience Research Center (CBRC); Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division


    Related items

    Showing items related by title, author, creator and subject.

    • Greedy Algorithm for the Construction of Approximate Decision Rules for Decision Tables with Many-Valued Decisions

      Azad, Mohammad; Moshkov, Mikhail; Zielosko, Beata (Lecture Notes in Computer Science, Springer Nature, 2016-10-21) [Book Chapter]
      The paper is devoted to the study of a greedy algorithm for the construction of approximate decision rules. This algorithm is applicable to decision tables with many-valued decisions, where each row is labeled with a set of decisions. For a given row, we should find a decision from the set attached to this row. We consider bounds on the precision of this algorithm relative to the length of rules. To illustrate the proposed approach, we study a problem of recognition of labels of points in the plane. This paper also contains results of experiments with modified decision tables from the UCI Machine Learning Repository.
    • Decision and Inhibitory Trees for Decision Tables with Many-Valued Decisions

      Azad, Mohammad (2018-06-06) [Dissertation]
      Advisor: Moshkov, Mikhail
      Committee members: Bajic, Vladimir B.; Zhang, Xiangliang; Boros, Endre
      Decision trees are among the most commonly used tools in decision analysis, knowledge representation, machine learning, etc., because of their simplicity and interpretability. We consider an extension of the dynamic programming approach that processes the whole set of decision trees for a given decision table, which was previously attainable only by brute-force algorithms. We study decision tables with many-valued decisions (each row may contain multiple decisions) because they are more reasonable models of data in many cases. To address this problem in a broad sense, we consider not only decision trees but also inhibitory trees, where terminal nodes are labeled with “≠ decision”. Inhibitory trees can sometimes describe more knowledge from datasets than decision trees. As cost functions, we consider depth or average depth to minimize the time complexity of trees, and the number of nodes or the number of terminal or nonterminal nodes to minimize the space complexity of trees. We investigate the multi-stage optimization of trees relative to some cost functions, and also the possibility of describing the whole set of strictly optimal trees. Furthermore, we study the bi-criteria optimization cost vs. cost and cost vs. uncertainty for decision trees, and cost vs. cost and cost vs. completeness for inhibitory trees. The most interesting application of the developed technique is the creation of multi-pruning and restricted multi-pruning approaches, which are useful for knowledge representation and prediction. The experimental results show that decision trees constructed by these approaches can often outperform the decision trees constructed by the CART algorithm. Another application includes the comparison of 12 greedy heuristics for single- and bi-criteria optimization (cost vs. cost) of trees. We also study three approaches (decision tables with many-valued decisions, decision tables with most common decisions, and decision tables with generalized decisions) to handling inconsistency of decision tables. Finally, we analyze the time complexity of decision and inhibitory trees over arbitrary sets of attributes represented by information systems in the frameworks of the local approach (when trees may use only attributes from the problem description) and the global approach (when trees may use arbitrary attributes from the information system).
    • Decision and Inhibitory Trees and Rules for Decision Tables with Many-valued Decisions

      Alsolami, Fawaz; Azad, Mohammad; Chikalov, Igor; Moshkov, Mikhail (Springer Nature, 2019-05-21) [Book]
      The results presented here (including the assessment of a new tool – inhibitory trees) offer valuable tools for researchers in the areas of data mining, knowledge discovery, and machine learning, especially those whose work involves decision tables with many-valued decisions. The authors consider various examples of problems and corresponding decision tables with many-valued decisions, discuss the difference between decision and inhibitory trees and rules, and develop tools for their analysis and design. Applications include the study of totally optimal (optimal in relation to a number of criteria simultaneously) decision and inhibitory trees and rules; the comparison of greedy heuristics for tree and rule construction as single-criterion and bi-criteria optimization algorithms; and the development of a restricted multi-pruning approach used in classification and knowledge representation.