High-Dimensional and Big Data

Accelerated Componentwise Gradient Boosting Using Efficient Data Representation and Momentum-Based Optimization

Pages 631-641 | Received 08 Oct 2021, Accepted 08 Aug 2022, Published online: 11 Oct 2022
 

Abstract

Componentwise boosting (CWB), also known as model-based boosting, is a variant of gradient boosting that builds on additive models as base learners to ensure interpretability. CWB is thus often used in research areas where models are employed as tools to explain relationships in data. One downside of CWB is its computational complexity in terms of memory and runtime. In this article, we propose two techniques to overcome these issues without losing the properties of CWB: discretization of numerical features and the incorporation of Nesterov momentum into functional gradient descent. As the latter can be prone to early overfitting, we also propose a hybrid approach that prevents a possibly diverging gradient descent routine while ensuring faster convergence. Our adaptations improve vanilla CWB by reducing memory consumption and speeding up the computation time per iteration (through feature discretization), while momentum enables CWB to learn faster and hence to require fewer iterations in total. We perform extensive benchmarks on multiple simulated and real-world datasets to demonstrate the improvements in runtime and memory consumption while maintaining state-of-the-art estimation and prediction performance.
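To make the two adaptations concrete, the sketch below (Python/NumPy) illustrates componentwise L2 boosting with one linear base learner per feature, where (i) each numerical feature is discretized into quantile bins so that the per-iteration least-squares fits operate on bin representatives weighted by bin counts, and (ii) the pseudo-residuals are evaluated at a Nesterov-style lookahead point. This is an illustrative sketch only, not the authors' compboost implementation and not the exact accelerated or hybrid algorithm of the article; all function names, the square-root binning heuristic, and the hyperparameter defaults are assumptions made for this example.

import numpy as np


def bin_feature(x, n_bins):
    # Discretize a numeric feature: quantile bin edges, per-observation bin
    # index, bin representatives (bin means), and bin counts used as weights.
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    reps = np.array([x[idx == b].mean() if np.any(idx == b) else edges[b]
                     for b in range(n_bins)])
    return reps, idx, np.bincount(idx, minlength=n_bins).astype(float)


def fit_binned_linear(reps, idx, counts, residual):
    # Weighted least squares r ~ a + b * x on the bin representatives;
    # aggregating the residuals per bin is equivalent to fitting on all n
    # rows of the discretized feature, but only n_bins rows are touched.
    r_sum = np.bincount(idx, weights=residual, minlength=len(reps))
    r_sq = np.bincount(idx, weights=residual ** 2, minlength=len(reps))
    X = np.column_stack([np.ones_like(reps), reps])
    XtWX = X.T @ (X * counts[:, None])
    coef = np.linalg.solve(XtWX + 1e-12 * np.eye(2), X.T @ r_sum)
    pred = X @ coef
    sse = np.sum(r_sq - 2.0 * pred * r_sum + counts * pred ** 2)  # exact SSE over all n rows
    return coef, sse


def boost(X, y, n_iter=500, learning_rate=0.05, gamma=0.9, n_bins=None):
    # Componentwise L2 boosting with momentum: in each iteration the single
    # best-fitting (binned) linear base learner is selected and added, and
    # the pseudo-residuals are computed at a Nesterov-style lookahead point.
    # For brevity only the training-set predictions and the trace of selected
    # base learners are tracked, not a full additive model for new data.
    n, p = X.shape
    n_bins = n_bins or int(np.ceil(np.sqrt(n)))         # common binning heuristic
    bins = [bin_feature(X[:, j], n_bins) for j in range(p)]
    pred = np.full(n, y.mean())                         # offset / intercept model
    momentum = np.zeros(n)                              # momentum in prediction space
    selected = []                                       # (feature index, coefficients)
    for _ in range(n_iter):
        residual = y - (pred + gamma * momentum)        # negative L2 gradient at lookahead
        best = None
        for j, (reps, idx, counts) in enumerate(bins):
            coef, sse = fit_binned_linear(reps, idx, counts, residual)
            if best is None or sse < best[0]:
                best = (sse, j, coef)
        _, j, coef = best
        reps, idx, _ = bins[j]
        update = (np.column_stack([np.ones_like(reps), reps]) @ coef)[idx]
        momentum = gamma * momentum + learning_rate * update
        pred = pred + momentum                          # accelerated functional step
        selected.append((j, coef))
    return pred, selected

Because each base-learner fit in this sketch touches only about sqrt(n) aggregated rows instead of n observations, both the memory held per feature and the runtime per iteration shrink, while the momentum term lets the ensemble reach a given training risk in fewer iterations. Setting gamma = 0 falls back to plain componentwise boosting, which loosely mirrors the idea behind the hybrid approach mentioned in the abstract; the article's actual accelerated and hybrid variants differ in their details.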

Disclosure Statement

The authors report there are no competing interests to declare.

Supplementary Material

Appendix: Descriptions of possible categorical feature representations with a short comparison w.r.t. runtime and memory consumption as well as class selection properties in the presence of noise. The appendix further contains empirical validation of the computational complexity estimates given in Sections 2.4 and 3.1.3, as well as a figure for the full benchmark.

Source code of compboost: github.com/schalkdaniel/compboost (commit of the snapshot used in this article: c68e8fb32aea862750991260d243cdca1d3ebd0e)

Benchmark source code: https://github.com/schalkdaniel/cacb-paper-bmr.

Benchmark Docker: Docker image with pre-installed packages to run the benchmark and access results for manual inspection: hub.docker.com/repository/docker/schalkdaniel/cacb-paper-bmr.

Additional information

Funding

This work was supported by the German Federal Ministry of Education and Research (BMBF) under Grant No. 01IS18036A and the Federal Ministry for Research and Technology (BMFT) under Grant FKZ: 01ZZ1804C (DIFUTURE, MII). The authors of this work take full responsibility for its content.
