Open Access Article
Entropy 2014, 16(10), 5242-5262; doi:10.3390/e16105242

How to Mine Information from Each Instance to Extract an Abbreviated and Credible Logical Rule

1 Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China
2 State Key Laboratory of Computer Science, Beijing 100080, China
3 College of Information Science and Engineering, Northeastern University, Shenyang City 110819, China
* Author to whom correspondence should be addressed.
Received: 17 July 2014 / Revised: 28 August 2014 / Accepted: 28 September 2014 / Published: 9 October 2014

Abstract

Decision trees are particularly promising for symbolic representation and reasoning because their comprehensible structure resembles the hierarchical process of human decision making. However, the drawbacks caused by the single-tree structure cannot be ignored: a rigid decision path may let the majority class overwhelm the other classes on imbalanced data sets, and pruning removes not only superfluous nodes but also whole subtrees. The proposed learning algorithm, the flexible hybrid decision forest (FHDF), mines the information implicated in each instance to form logical rules on the basis of a chain rule of local mutual information, and then builds different decision tree structures that together constitute a decision forest. The most credible decision path in the forest is selected to make each prediction. Furthermore, functional dependencies (FDs), extracted from the whole data set by association rule analysis, perform embedded attribute selection that removes nodes rather than subtrees, helping to achieve different levels of knowledge representation and to improve model comprehensibility in a semi-supervised learning framework. Naive Bayes replaces the leaf nodes at the bottom of the tree hierarchy, where the conditional independence assumption is more likely to hold; this reduces the potential for overfitting and overtraining and improves prediction quality and generalization. Experimental results on UCI data sets demonstrate the efficacy of the proposed approach.
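The abstract names three computational ingredients: ranking attributes by (local) mutual information with the class, detecting functional dependencies via association rule analysis, and placing naive Bayes classifiers at the leaves. The Python sketch below is not the authors' FHDF implementation; it only illustrates those ingredients under simplifying assumptions (per-attribute mutual information instead of the paper's chain rule, exact FD checking instead of support/confidence-thresholded association rules), and every name in it (rank_attributes, determines, NaiveBayesLeaf) is hypothetical.

```python
import math
from collections import Counter, defaultdict

def mutual_information(xs, ys):
    """Empirical mutual information I(X; Y) between two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def rank_attributes(rows, labels):
    """Order attribute indices by I(attribute; class), highest first --
    a simplified stand-in for the paper's chain rule of local MI."""
    cols = list(zip(*rows))
    return sorted(range(len(cols)),
                  key=lambda j: mutual_information(cols[j], labels),
                  reverse=True)

def determines(rows, j, k):
    """True if attribute j functionally determines attribute k in this
    data set (an association rule j -> k holding with confidence 1);
    if so, k is redundant once j appears on a decision path."""
    seen = {}
    for row in rows:
        if row[j] in seen and seen[row[j]] != row[k]:
            return False
        seen[row[j]] = row[k]
    return True

class NaiveBayesLeaf:
    """Laplace-smoothed naive Bayes classifier placed at the bottom of a
    decision path, where conditional independence is more plausible."""
    def fit(self, rows, labels):
        self.priors = Counter(labels)
        self.n = len(labels)
        self.counts = defaultdict(Counter)   # (attr, class) -> value counts
        self.values = defaultdict(set)       # attr -> observed values
        for row, y in zip(rows, labels):
            for j, v in enumerate(row):
                self.counts[(j, y)][v] += 1
                self.values[j].add(v)
        return self

    def predict(self, row):
        def log_posterior(y):
            lp = math.log(self.priors[y] / self.n)
            for j, v in enumerate(row):
                c = self.counts[(j, y)]
                lp += math.log((c[v] + 1) /
                               (sum(c.values()) + len(self.values[j])))
            return lp
        return max(self.priors, key=log_posterior)
```

On a categorical data set, rank_attributes gives a greedy ordering for growing a decision path, determines flags attributes that an earlier node already fixes and can therefore be dropped as nodes (rather than subtrees), and NaiveBayesLeaf stands in for a leaf classifier once few instances remain on a path.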
Keywords: decision forest; naive Bayes; functional dependency; semi-supervised learning

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Wang, L.; Sun, M.; Cao, C. How to Mine Information from Each Instance to Extract an Abbreviated and Credible Logical Rule. Entropy 2014, 16, 5242-5262.

