Abstract
Research on imbalanced data classification has been an active topic in recent years. LogitBoost is an adaptive algorithm that achieves high prediction accuracy, but when faced with imbalanced data it can produce a large prediction error on the minority class. In this article, we propose an improved LogitBoost algorithm, named BLogitBoost, based on a stratified normalization method that first normalizes the sampling probabilities within each class and then normalizes between classes. Experiments on both simulated and empirical data show that the new algorithm significantly reduces the minority-class prediction error.
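The stratified normalization idea mentioned above can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the function name `stratified_normalize` and the choice of giving every class equal total probability mass are assumptions made for illustration.

```python
import numpy as np

def stratified_normalize(weights, labels):
    """Sketch of a stratified normalization of sampling weights:
    normalize within each class first, then between classes.

    Here 'between-class normalization' is assumed to mean assigning
    each class an equal share of the total probability mass, so the
    minority class is not swamped by the majority class.
    """
    weights = np.asarray(weights, dtype=float)
    labels = np.asarray(labels)
    classes = np.unique(labels)
    out = np.zeros_like(weights)
    for c in classes:
        mask = labels == c
        # Step 1: within-class normalization of sampling probabilities.
        within = weights[mask] / weights[mask].sum()
        # Step 2: between-class normalization -- equal mass per class.
        out[mask] = within / len(classes)
    return out

# Two majority-class points with small weights, two minority-class
# points: after stratification each class holds half the total mass.
probs = stratified_normalize([1, 1, 2, 4], [0, 0, 1, 1])
```

Under plain (global) normalization, a large majority class would absorb most of the sampling probability; the stratified version keeps the minority class's share fixed regardless of class sizes.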
Mathematics Subject Classification:
Acknowledgments
This research is supported by the Fundamental Research Funds for the Central Universities and by the Research Funds of Renmin University of China (No. 2011030017).