dc.description.abstract | In daily life, many datasets exhibit the class imbalance problem, in which one class contains a very large number of data samples whereas another class contains a very small number. Examples include bankruptcy prediction, rare disease diagnosis, accidental casualties, and so on. When trained on class imbalanced datasets, traditional binary classification algorithms produce biased predictions, with results tending to favor the majority class. In recent years, a considerable number of scholars have proposed solutions to the class imbalance problem.
In this study, unlike related works that propose novel algorithms to enhance the performance of existing classification techniques, we focus on identifying the best baseline classifier for the class imbalance domain problem. The findings of this study can serve as a guideline for future research, allowing novel algorithms to be compared against the identified baseline classifier.
The experiments are based on 44 datasets from various domains with different imbalance ratios, on which three popular classifiers, i.e., J48, MLP, and SVM, are constructed and compared. Moreover, classifier ensembles using the bagging and boosting methods are also developed. The results show that the bagging-based MLP classifier ensembles perform the best in terms of the AUC rate. | en_US |