Original Articles

Ensemble of fast learning stochastic gradient boosting

Pages 40-52 | Received 18 Oct 2017, Accepted 12 Jul 2019, Published online: 25 Jul 2019
Abstract

Boosting is one of the most popular and powerful learning algorithms. However, due to the sequential nature of its model fitting, the computational time of a boosting algorithm can be prohibitive for big data analysis. In this paper, we propose a parallel framework for boosting, called Ensemble of Fast Learning Stochastic Gradient Boosting (EFLSGB). The proposed EFLSGB is well suited for parallel execution and can therefore substantially reduce computational time. Analysis of simulated and real datasets demonstrates that EFLSGB achieves highly competitive prediction accuracy in comparison with gradient tree boosting.
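The abstract only sketches the idea, but the parallel structure it describes can be illustrated with a minimal sketch: train several fast-learning stochastic gradient boosting models independently (each on its own random subsamples), run those fits in parallel, and average their predictions. This is an assumption about the method based on the abstract alone, not the paper's actual algorithm; all function names below (`fit_stump`, `fit_sgb`, `eflsgb`) are illustrative.

```python
# Hedged sketch of the EFLSGB idea as described in the abstract: independent
# stochastic gradient boosting learners fit in parallel, predictions averaged.
# All names and parameter choices here are illustrative assumptions.
import random
from concurrent.futures import ThreadPoolExecutor

def fit_stump(X, y):
    """Fit the best single-split regression stump (squared-error loss)."""
    n, p = len(X), len(X[0])
    best = None
    for j in range(p):
        for t in sorted(set(row[j] for row in X)):
            left = [y[i] for i in range(n) if X[i][j] <= t]
            right = [y[i] for i in range(n) if X[i][j] > t]
            if not left or not right:
                continue
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            err = (sum((v - lm) ** 2 for v in left)
                   + sum((v - rm) ** 2 for v in right))
            if best is None or err < best[0]:
                best = (err, j, t, lm, rm)
    _, j, t, lm, rm = best
    return lambda row: lm if row[j] <= t else rm

def fit_sgb(X, y, n_rounds=30, lr=0.1, subsample=0.5, seed=0):
    """One fast-learning SGB model: stumps fit to residuals on subsamples."""
    rng = random.Random(seed)
    f0 = sum(y) / len(y)                     # initial constant fit
    resid = [v - f0 for v in y]
    stumps = []
    for _ in range(n_rounds):
        idx = rng.sample(range(len(X)), max(2, int(subsample * len(X))))
        stump = fit_stump([X[i] for i in idx], [resid[i] for i in idx])
        stumps.append(stump)
        # shrink the step by the learning rate, update residuals
        resid = [resid[i] - lr * stump(X[i]) for i in range(len(X))]
    return lambda row: f0 + lr * sum(s(row) for s in stumps)

def eflsgb(X, y, n_members=4):
    """Ensemble: members are independent, so their fits can run in parallel
    (a thread pool is used here for simplicity; a process pool or cluster
    would give real speedup for CPU-bound fits)."""
    with ThreadPoolExecutor() as pool:
        members = list(pool.map(lambda s: fit_sgb(X, y, seed=s),
                                range(n_members)))
    return lambda row: sum(m(row) for m in members) / len(members)

# Toy usage on a 1-D linear target.
X = [[i / 10.0] for i in range(20)]
y = [2 * x[0] for x in X]
model = eflsgb(X, y)
pred = model([0.55])
```

Because the members share no state during fitting, this design parallelizes trivially, which is presumably what makes the framework "well suited for parallel execution" in the abstract.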

Acknowledgment

Portions of this research were conducted with high performance computational resources provided by the Louisiana Optical Network Infrastructure (http://www.loni.org).
