Open Access Article
Entropy 2019, 21(2), 198; https://doi.org/10.3390/e21020198

Attribute Selection Based on Constraint Gain and Depth Optimal for a Decision Tree

1 School of Computer Science, Huainan Normal University, Huainan 232038, China
2 School of Computer and Information, Hefei University of Technology, Hefei 230009, China
* Author to whom correspondence should be addressed.
Received: 2 January 2019 / Revised: 8 February 2019 / Accepted: 13 February 2019 / Published: 19 February 2019

Abstract

Uncertainty evaluation based on statistical probabilistic information entropy is a commonly used mechanism for constructing heuristic methods in decision tree learning. Deviations in the entropy kernel are potentially linked to the classification performance of the resulting decision tree. This paper presents a decision tree learning algorithm based on constraint gain and depth-induction optimization. First, we calculate and analyze the uncertainty distributions of information entropy for single-valued and multi-valued events, establishing an enhancement property of the single-valued event entropy kernel, the peak behavior of multi-valued event entropy, and a reciprocal relationship between the peak location and the number of possible events. Second, we propose an estimation method for information entropy in which the entropy kernel is replaced with a peak-shift sine function, and on this basis establish a constraint-gain decision tree (CGDT) learning algorithm. Finally, by combining branch-convergence and fan-out indices with the induction depth of the decision tree, we build a constraint-gain and depth-induction decision tree (CGDIDT) learning algorithm. Experimental results show the benefits of the CGDT and CGDIDT algorithms.
Keywords: decision tree; attribute selection measure; entropy; constraint entropy; constraint gain; branch convergence and fan-out
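The abstract's core idea, replacing the Shannon entropy kernel -p*log2(p) with a peak-shift sine function whose peak sits at the reciprocal of the number of possible events, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's exact formulation: the kernel form sin(pi * p**alpha), the exponent alpha = ln(2)/ln(n), and the helper names sine_kernel, constraint_entropy, and constraint_gain are all choices made here so that the kernel's peak falls exactly at p = 1/n.

    import math
    from collections import Counter

    def sine_kernel(p, n):
        """Illustrative peak-shift sine kernel (an assumption, not the
        paper's exact function): sin(pi * p**alpha), with alpha chosen so
        the peak falls at p = 1/n, mirroring the reciprocal relationship
        between peak location and the number of possible events."""
        if p <= 0.0 or n < 2:
            return 0.0
        alpha = math.log(2) / math.log(n)  # peak of sin(pi*x) is at x=1/2
        return math.sin(math.pi * p ** alpha)

    def constraint_entropy(labels):
        """Entropy-like impurity that sums the sine kernel over class
        proportions instead of the Shannon kernel -p*log2(p)."""
        counts = Counter(labels)
        total = len(labels)
        n = max(len(counts), 2)  # avoid log(1) when only one class remains
        return sum(sine_kernel(c / total, n) for c in counts.values())

    def constraint_gain(rows, labels, attr):
        """Gain of splitting on attribute index `attr`, computed with the
        constraint entropy in place of Shannon entropy."""
        base = constraint_entropy(labels)
        groups = {}
        for row, y in zip(rows, labels):
            groups.setdefault(row[attr], []).append(y)
        remainder = sum(len(g) / len(labels) * constraint_entropy(g)
                        for g in groups.values())
        return base - remainder

Choosing, at each node, the attribute with the largest constraint gain gives a CGDT-style split criterion; the branch-convergence and fan-out indices that drive the depth induction of CGDIDT are not modeled in this sketch.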
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Sun, H.; Hu, X.; Zhang, Y. Attribute Selection Based on Constraint Gain and Depth Optimal for a Decision Tree. Entropy 2019, 21, 198.


