Open Access Article
Entropy 2017, 19(11), 583

Instance Selection for Classifier Performance Estimation in Meta Learning

Department of Applied Informatics, Silesian University of Technology, 44-100 Gliwice, Poland
Received: 20 September 2017 / Revised: 22 October 2017 / Accepted: 23 October 2017 / Published: 1 November 2017
(This article belongs to the Section Complexity)


Building an accurate prediction model is challenging and requires appropriate model selection. This process is very time consuming but can be accelerated with meta-learning: the automatic recommendation of a model by estimating the performance of candidate prediction models without training them. Meta-learning utilizes metadata extracted from the dataset to effectively estimate the accuracy of the model in question. To achieve that goal, metadata descriptors must be gathered efficiently and must be informative enough to allow precise estimation of prediction accuracy. In this paper, a new type of metadata descriptor is analyzed. These descriptors are based on the compression level obtained from instance selection methods at the data-preprocessing stage. To verify their suitability, two types of experiments on real-world datasets were conducted. In the first, 11 instance selection methods were examined in order to validate the compression-accuracy relation for three classifiers: k-nearest neighbors (kNN), support vector machine (SVM), and random forest. From this analysis, two methods are recommended (instance-based learning type 2 (IB2) and edited nearest neighbor (ENN)), which are then compared with the state-of-the-art metadata descriptors. The obtained results confirm that the two suggested compression-based meta-features help to predict the accuracy of the base model much more accurately than the state-of-the-art solutions.
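To make the compression meta-feature concrete, the following is a minimal illustrative sketch (not the paper's implementation) of edited nearest neighbor (ENN) editing: each instance misclassified by the majority vote of its k nearest neighbors is dropped, and the fraction of retained instances serves as a compression-based meta-descriptor. All names and the toy dataset here are hypothetical.

```python
import numpy as np

def enn_compression(X, y, k=3):
    """Edited Nearest Neighbor (ENN): drop instances misclassified by the
    majority vote of their k nearest neighbors, then report the fraction
    of instances retained as a compression meta-feature."""
    n = len(X)
    # pairwise squared Euclidean distances
    d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)           # an instance is not its own neighbor
    keep = np.empty(n, dtype=bool)
    for i in range(n):
        nn = np.argsort(d[i])[:k]         # indices of the k nearest neighbors
        votes = np.bincount(y[nn])
        keep[i] = votes.argmax() == y[i]  # keep only correctly classified instances
    return X[keep], y[keep], keep.mean()  # retention ratio in [0, 1]

# toy data: two well-separated clusters plus one mislabeled point (index 3)
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [0.1, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1], [5.1, 5.1]])
y = np.array([0, 0, 0, 1, 1, 1, 1, 1])
Xr, yr, ratio = enn_compression(X, y, k=3)  # ratio -> 0.875 (7 of 8 kept)
```

ENN removes noisy or borderline instances, so a low retention ratio signals a noisy decision boundary; the paper uses such compression levels as inputs for estimating base-model accuracy.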
Keywords: machine learning; classification; instance selection; meta-learning; accuracy estimation


This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Supplementary material


MDPI and ACS Style

Blachnik, M. Instance Selection for Classifier Performance Estimation in Meta Learning. Entropy 2017, 19, 583.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland