Imbalanced Learning Based on Data-Partition and SMOTE
Abstract
The classification of data with an imbalanced class distribution is a significant challenge for most conventional classification learning methods, which assume a relatively balanced class distribution. This paper proposes a novel classification method for imbalanced learning based on data partition and SMOTE. The proposed method differs from conventional ones in both the learning and the prediction stages. In the learning stage, the proposed method takes three steps to learn a class-imbalance-oriented model: (1) partition the majority class into several clusters using a data-partition method such as K-Means; (2) construct a novel training set by applying SMOTE to each data set obtained by merging a cluster with the minority class; and (3) learn a classification model on each training set using a conventional classification learning method such as a decision tree, SVM, or neural network. This yields a classifier repository consisting of several classification models. In the prediction stage, for a given example to be classified, the proposed method uses the partition model built in the learning stage to select a model from the classifier repository and predicts the example with that model. Comprehensive experiments on KEEL data sets show that the proposed method outperforms several existing methods on the recall, g-mean, f-measure, and AUC evaluation measures.
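The two stages described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn and NumPy, uses a simplified hand-rolled SMOTE (linear interpolation between a minority sample and one of its nearest minority neighbours) in place of the full algorithm, and uses decision trees as the base learner. All data, cluster counts, and helper names here are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def smote(X_min, n_new, k=5):
    """Simplified SMOTE sketch: each synthetic point is interpolated
    between a random minority sample and one of its k nearest
    minority-class neighbours."""
    nn = NearestNeighbors(n_neighbors=min(k + 1, len(X_min))).fit(X_min)
    _, idx = nn.kneighbors(X_min)
    samples = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        j = idx[i][rng.integers(1, idx.shape[1])]  # skip self at position 0
        lam = rng.random()
        samples.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(samples).reshape(-1, X_min.shape[1])

# Toy imbalanced data: 200 majority examples (label 0), 20 minority (label 1).
X_maj = rng.normal(0.0, 1.0, (200, 2))
X_min = rng.normal(2.0, 0.5, (20, 2))

# Learning stage: (1) partition the majority class with K-Means,
# (2) balance each cluster + minority set with SMOTE,
# (3) train one model per cluster, forming the classifier repository.
k_clusters = 3
km = KMeans(n_clusters=k_clusters, n_init=10, random_state=0).fit(X_maj)
repository = []
for c in range(k_clusters):
    X_c = X_maj[km.labels_ == c]
    X_syn = smote(X_min, max(0, len(X_c) - len(X_min)))
    X_train = np.vstack([X_c, X_min, X_syn])
    y_train = np.hstack([np.zeros(len(X_c)),
                         np.ones(len(X_min) + len(X_syn))])
    repository.append(DecisionTreeClassifier(random_state=0).fit(X_train, y_train))

# Prediction stage: route each test example to the model of the
# majority cluster it falls into, then predict with that model.
X_test = np.array([[0.0, 0.0], [2.0, 2.0]])
chosen = km.predict(X_test)
preds = [int(repository[c].predict(x.reshape(1, -1))[0])
         for c, x in zip(chosen, X_test)]
print(preds)
```

The per-cluster SMOTE keeps each training set roughly balanced without discarding majority examples, and routing test points through the same K-Means model ties the prediction stage to the partition learned in the learning stage.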
Share & Cite This Article
Guo, H.; Zhou, J.; Wu, C.-A. Imbalanced Learning Based on Data-Partition and SMOTE. Information 2018, 9, 238.