Original Articles

Stratified Normalization LogitBoost for Two-Class Unbalanced Data Classification

Pages 1587-1593 | Received 27 Jun 2010, Accepted 30 Mar 2011, Published online: 16 Aug 2011
 

Abstract

Unbalanced data classification has been an active research topic in recent years. The LogitBoost algorithm is an adaptive algorithm that can achieve high prediction precision, but on unbalanced data it can produce a large prediction error on the minority class. In this article, we propose an improved LogitBoost algorithm, named BLogitBoost, based on a stratified normalization method that first normalizes the sampling probabilities within each class and then normalizes between classes. Experiments on simulated and empirical data show that the new algorithm significantly reduces the minority class prediction error.
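As a rough illustration only (not the authors' implementation), a minimal sketch of the stratified normalization step described in the abstract might look like the following, assuming boosting weights w, binary labels y in {0, 1}, and NumPy; the function name and the equal between-class split are illustrative assumptions.

```python
import numpy as np

def stratified_normalize(w, y):
    """Illustrative sketch: normalize weights within each class first,
    then between classes, so the minority class is not swamped.

    Within each class the weights are rescaled to sum to 1; the final
    division gives the two classes equal total weight (1/2 each).
    """
    w = np.asarray(w, dtype=float).copy()
    y = np.asarray(y)
    for c in (0, 1):
        mask = (y == c)
        total = w[mask].sum()
        if total > 0:
            w[mask] /= total       # within-class normalization
    return w / w.sum()             # between-class normalization

# Toy usage: 8 majority-class points, 2 minority-class points
y = np.array([0] * 8 + [1] * 2)
w = np.ones_like(y, dtype=float)
print(stratified_normalize(w, y))  # minority points receive larger weights
```

With this toy input, each majority point ends up with weight 0.0625 and each minority point with weight 0.25, so both classes contribute equally to the reweighted loss.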

Mathematics Subject Classification:

Acknowledgments

This research was supported by the Fundamental Research Funds for the Central Universities and the Research Funds of Renmin University of China (No. 2011030017).

