Original Articles

Stratified Normalization LogitBoost for Two-Class Unbalanced Data Classification

Pages 1587-1593 | Received 27 Jun 2010, Accepted 30 Mar 2011, Published online: 16 Aug 2011
 

Abstract

Research on unbalanced data classification has been a hot topic in recent years. LogitBoost is an adaptive algorithm that can achieve high prediction precision, but on unbalanced data it can produce a large prediction error on the minority class. In this article, we propose an improved LogitBoost algorithm, named BLogitBoost, based on a stratified normalization method that first normalizes the sampling probabilities within each class and then normalizes between the classes. Experiments on simulated and empirical data show that the new algorithm reduces the minority class prediction error significantly.
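The abstract only sketches the stratified normalization step, so the following is a minimal illustrative sketch of the general idea in Python, not the paper's exact procedure: sample weights in a two-class problem are first normalized within each class, and the two classes are then given equal total weight. The function name stratified_normalize and the equal between-class split are assumptions made for illustration.

```python
import numpy as np

def stratified_normalize(weights, labels):
    """Hypothetical sketch of stratified weight normalization for a
    two-class problem: weights are first normalized within each class,
    then the classes are given equal total weight (between-class
    normalization).  The exact scheme used by BLogitBoost is described
    in the paper; this is only an illustration of the general idea."""
    weights = np.asarray(weights, dtype=float)
    labels = np.asarray(labels)
    out = np.empty_like(weights)
    classes = np.unique(labels)            # expected: two classes
    for c in classes:
        mask = labels == c
        # within-class normalization: weights of class c sum to 1
        out[mask] = weights[mask] / weights[mask].sum()
    # between-class normalization: all weights sum to 1,
    # so each class carries equal total weight regardless of its size
    return out / len(classes)


if __name__ == "__main__":
    # toy unbalanced data: 8 majority samples, 2 minority samples
    y = np.array([0] * 8 + [1] * 2)
    w = np.ones_like(y, dtype=float)
    w_star = stratified_normalize(w, y)
    print(w_star[y == 0].sum(), w_star[y == 1].sum())  # 0.5 0.5
```

In the toy example, the two minority samples end up carrying the same total weight as the eight majority samples, which illustrates how rebalancing the sampling weights can keep the minority class from being dominated during boosting.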

Mathematics Subject Classification:

Acknowledgments

This research was supported by the Fundamental Research Funds for the Central Universities and by the Research Funds of Renmin University of China (No. 2011030017).
