- Article
Asymmetric Feature Weighting for Diversity-Enhanced Random Forests
- Ye Eun Kim, Seoung Yun Kim and Hyunjoong Kim
Random Forest (RF) is one of the most widely used ensemble learning algorithms for classification and regression tasks. Its performance, however, depends not only on the accuracy of the individual trees but also on the diversity among them. This study proposes a novel ensemble method, Heterogeneous Random Forest (HRF), which enhances ensemble diversity through adaptive and asymmetric feature weighting. Unlike conventional RF, which treats all features equally during tree construction, HRF dynamically reduces the sampling probability of features that have been selected frequently in earlier trees, particularly those appearing near their root nodes. This mechanism discourages repetitive feature usage and encourages a more balanced and heterogeneous ensemble structure. Simulation studies demonstrate that HRF effectively mitigates feature selection bias, increases structural diversity, and improves classification accuracy, particularly on datasets with low noise ratios and diverse feature cardinalities. Comprehensive experiments on 52 benchmark datasets further confirm that HRF achieves the highest overall performance and significant accuracy gains compared with standard ensemble methods. These results highlight that asymmetric feature weighting provides a simple yet powerful mechanism for promoting diversity and enhancing generalization in ensemble learning.
1 January 2026
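The weighting mechanism described in the abstract can be illustrated with a short sketch. The code below is only an assumed, simplified reading of that description, not the authors' HRF implementation: it keeps per-feature sampling weights, grows each tree on a weighted random feature subset, and multiplicatively down-weights features used in the tree just grown, with stronger penalties for splits near the root. The depth-discount rule, the decay constant 0.5, and all variable names are illustrative assumptions.

```python
# Minimal sketch of adaptive, asymmetric feature weighting (assumptions noted above).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           random_state=0)
n_trees, n_features = 100, X.shape[1]
mtry = max(1, int(np.sqrt(n_features)))   # features offered to each tree
weights = np.ones(n_features)             # start from uniform sampling weights
trees, feature_sets = [], []

for _ in range(n_trees):
    # Draw this tree's feature subset with probability proportional to weights.
    p = weights / weights.sum()
    subset = rng.choice(n_features, size=mtry, replace=False, p=p)
    # Bootstrap rows and fit a tree on the selected columns only.
    rows = rng.integers(0, len(X), len(X))
    tree = DecisionTreeClassifier(random_state=0).fit(X[rows][:, subset], y[rows])
    trees.append(tree)
    feature_sets.append(subset)

    # Asymmetric update: features split on in this tree are down-weighted,
    # and splits near the root are penalized more than deep splits.
    t = tree.tree_
    depth = np.zeros(t.node_count, dtype=int)
    for node in range(t.node_count):          # parents precede children in sklearn
        left, right = t.children_left[node], t.children_right[node]
        if left != -1:                        # internal (split) node
            depth[left] = depth[right] = depth[node] + 1
            global_feat = subset[t.feature[node]]
            weights[global_feat] *= 1.0 - 0.5 / (depth[node] + 1)
    weights = np.clip(weights, 1e-3, None)    # keep every feature selectable

# Majority vote over the ensemble (illustration only).
votes = np.stack([tr.predict(X[:, fs]) for tr, fs in zip(trees, feature_sets)])
pred = np.apply_along_axis(lambda v: np.bincount(v.astype(int)).argmax(), 0, votes)
print("Training accuracy of the sketch ensemble:", (pred == y).mean())
```

Under these assumptions, features that dominate early splits become progressively less likely to be offered to later trees, which is the diversity-promoting effect the abstract attributes to HRF.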




