Abstract
Class imbalance remains a major challenge in machine learning, often degrading model performance by making it difficult to learn meaningful patterns from the underrepresented (minority) class. Traditional classifiers tend to be biased toward the majority class, leading to poor generalization on minority class instances. Moreover, the distribution and representation of minority class data are often poorly understood and inadequately handled during model training. To address these challenges, we propose a novel method called Dynamic Boost-Wrapper with Backpropagation and Monitored-Modified Sigmoid Deep Neural Network (DM2DN2). This approach integrates dynamic boosting with a wrapper-based strategy and a modified sigmoid activation function to enhance minority class learning. The dynamic boosting component rebalances the training data using a bootstrap approach, while the backpropagation-based deep neural network is guided by a monitored and modified sigmoid activation to improve convergence and representation. After training, the model synthesizes balanced class data from the learned class distributions; DM2DN2 is not GAN-based and relies on bootstrap-driven synthesis rather than adversarial training. Four KEEL datasets are used in experiments to compare the performance of DM2DN2 with state-of-the-art techniques. Experimental results demonstrate that the proposed method significantly outperforms several state-of-the-art techniques in handling imbalanced datasets, particularly in terms of robustness, achieving classification accuracy above 97%.
