Dimensionality Reduction, Regularization, and Variable Selection

MIP-BOOST: Efficient and Effective L0 Feature Selection for Linear Regression

Pages 566-577 | Received 19 Sep 2019, Accepted 14 Oct 2020, Published online: 04 Jan 2021
 

Abstract

Recent advances in mathematical programming have made mixed integer optimization a competitive alternative to popular regularization methods for selecting features in regression problems. The approach exhibits unquestionable foundational appeal and versatility, but also poses important challenges. Here, we propose MIP-BOOST, a revision of standard mixed integer programming feature selection that reduces the computational burden of tuning the critical sparsity bound parameter and improves performance in the presence of feature collinearity and of signals that vary in nature and strength. The final outcome is a more efficient and effective L0 feature selection method for applications of realistic size and complexity, grounded on rigorous cross-validation tuning and exact optimization of the associated mixed integer program. Computational viability and improved performance in realistic scenarios are achieved through three independent but synergistic proposals. Supplementary materials including additional results, pseudocode, and computer code are available online.
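The mixed integer program underlying L0 feature selection can be viewed as a cardinality-constrained least-squares problem. The sketch below is only an illustration of that standard formulation for a fixed sparsity bound k, not the authors' MIP-BOOST implementation; it assumes the JuMP modeling language with the Gurobi solver (the solver cited in the notes) and a user-supplied big-M bound M on the coefficient magnitudes.

using JuMP, Gurobi

# Solve min_beta ||y - X*beta||^2 subject to ||beta||_0 <= k,
# using binary indicators z[j] and a big-M link (M is an assumed bound
# on |beta[j]|; in practice it must be chosen with care).
function l0_least_squares(X::Matrix{Float64}, y::Vector{Float64}, k::Int; M::Float64 = 10.0)
    n, p = size(X)
    model = Model(Gurobi.Optimizer)
    set_silent(model)

    @variable(model, -M <= beta[1:p] <= M)   # regression coefficients
    @variable(model, z[1:p], Bin)            # z[j] = 1 iff feature j is selected

    # beta[j] can be nonzero only when z[j] = 1
    @constraint(model, [j = 1:p],  beta[j] <= M * z[j])
    @constraint(model, [j = 1:p], -beta[j] <= M * z[j])

    # sparsity bound: at most k selected features
    @constraint(model, sum(z) <= k)

    # quadratic least-squares objective
    @objective(model, Min,
        sum((y[i] - sum(X[i, j] * beta[j] for j in 1:p))^2 for i in 1:n))

    optimize!(model)
    return value.(beta), round.(Int, value.(z))
end

A k-specific solve of this kind is then embedded in a cross-validation search over the sparsity bound; making that tuning loop efficient is the focus of MIP-BOOST.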

Supplementary Materials

Supplement-MIP-BOOST.pdf: Contains pseudocode, additional simulation results, and a convergence proof sketch for bisection with feelers.

MIPBOOSTex.jl: Julia code running a simulated example of the methods described in the article. Contains functions for each component for general use.

Acknowledgments

We thank Matthew Reimherr for useful discussions and comments.

Notes

1 Calculations were performed on the Roar Supercomputer at Penn State, using the basic memory option on the ACI-B cluster with an Intel Xeon 24-core processor at 2.2 GHz and 128 GB of RAM. The multi-thread option in Gurobi was limited to a maximum of 5 threads for consistency across settings.

2 For SNR = 10, MIP-BOOST was run with a more conservative threshold for BF; a quick LASSO run that selects a very large number of features can inexpensively diagnose the need for a restrictive BF. However, we stress that the BF used in MIP-BOOST does not suffer from the lack of specificity that hinders the LASSO at high SNRs. As the SNR increases, the true k0 in fact becomes easier for MIP to identify; the BF search is simply pushed toward the right of the k range if seemingly large differences in CVMSE are not "ignored" by setting a stricter threshold. In the corresponding CVMSE plot, the elbow at k = 10 is clearly visible when SNR = 10, but the strong signal also produces distinct drops in CVMSE at large k values. A more restrictive threshold easily improves solution quality. In contrast, a more conservative tuning for the LASSO (e.g., the parsimonious LSD) still fails to increase specificity.
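As an illustration of the threshold idea discussed above (and only that: the bisection-with-feelers search itself is described in the article and supplement), one simple rule is to accept the smallest k whose cross-validated MSE falls within a relative tolerance of the best CVMSE observed, so that seemingly large but practically irrelevant drops at large k are ignored. The function name and default tolerance below are hypothetical.

# Hypothetical threshold rule on cross-validated MSE: return the smallest k
# (ks assumed sorted in increasing order) whose CVMSE is within a relative
# tolerance tau of the minimum. Not the article's BF procedure.
function smallest_k_within_threshold(ks::Vector{Int}, cvmse::Vector{Float64}; tau::Float64 = 0.05)
    best = minimum(cvmse)
    for (i, k) in enumerate(ks)
        if cvmse[i] <= (1 + tau) * best
            return k
        end
    end
    return last(ks)   # unreachable when the minimum is attained on ks; kept as a safeguard
end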

Additional information

Funding

This work was partially funded by the NIH B2D2K training grant and the Huck Institutes of the Life Sciences of Penn State, and by NSF grant DMS-1407639. Computation was performed on the Roar Supercomputer at Penn State University.

