Exploiting aggregate sparsity in second-order cone relaxations for quadratic constrained quadratic programming problems

Heejune Sheen & Makoto Yamashita
Pages 753–771 | Received 05 Nov 2019, Accepted 18 Sep 2020, Published online: 29 Sep 2020
 

Abstract

Among the many approaches to increasing the computational efficiency of the semidefinite programming (SDP) relaxation for nonconvex quadratic constrained quadratic programming problems (QCQPs), exploiting the aggregate sparsity of the data matrices in the SDP by Fukuda et al. [Exploiting sparsity in semidefinite programming via matrix completion I: General framework, SIAM J. Optim. 11(3) (2001), pp. 647–674] and second-order cone programming (SOCP) relaxation have been popular. In this paper, we exploit the aggregate sparsity of the SOCP relaxation of nonconvex QCQPs. Specifically, we prove that exploiting the aggregate sparsity reduces the number of second-order cones in the SOCP relaxation, and that the matrix completion procedure of Fukuda et al. can be simplified in both the primal and the dual of the SOCP relaxation without losing the max-determinant property. For the numerical experiments, nonconvex QCQPs from lattice graphs and pooling problems are tested, as their SOCP relaxations attain the same optimal value as the SDP relaxations. We demonstrate that exploiting the aggregate sparsity improves the computational efficiency of the SOCP relaxation while retaining the same objective value as the SDP relaxation, so that much larger problems can be handled by the proposed SOCP relaxation than by the SDP relaxation.
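
To fix ideas, the following sketch gives the generic textbook form of a nonconvex QCQP and of the SDP and SOCP relaxations referred to above; the notation (Q_k, q_k, gamma_k) is illustrative and not necessarily that of the paper.

\begin{align*}
  \min_{x \in \mathbb{R}^n} \quad & x^\top Q_0 x + 2 q_0^\top x \\
  \text{s.t.} \quad & x^\top Q_k x + 2 q_k^\top x + \gamma_k \le 0, \qquad k = 1, \dots, m.
\end{align*}

Lifting the variable to $X = x x^\top$ turns each quadratic form into $\langle Q_k, X \rangle$; relaxing the equality to $X - x x^\top \succeq O$ gives the SDP relaxation. Replacing that semidefinite constraint by rotated second-order cone constraints on the $2 \times 2$ principal minors, $X_{ii} X_{jj} \ge X_{ij}^2$ and $X_{ii} \ge x_i^2$, yields an SOCP relaxation with one cone per pair $(i, j)$. The aggregate sparsity pattern is the union of the nonzero patterns of $Q_0, Q_1, \dots, Q_m$; exploiting it amounts to keeping only the cones for pairs inside (a chordal extension of) that pattern, which is the reduction in the number of second-order cones mentioned in the abstract.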

Acknowledgments

This research was conducted while the first author was a research student at the Department of Mathematical and Computing Science, Tokyo Institute of Technology.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This research was partially supported by Grant-in-Aid for Scientific Research (B) [grant number 18K11176].

Notes on contributors

Heejune Sheen

Heejune Sheen is a master's student at the H. Milton Stewart School of Industrial and Systems Engineering, Georgia Institute of Technology. He graduated from the Korea Advanced Institute of Science and Technology (KAIST) with a B.S. in Mathematics. His research interests include statistics, machine learning, conic optimization, and convex optimization.

Makoto Yamashita

Makoto Yamashita is a professor at the Department of Mathematical and Computing Science, Tokyo Institute of Technology. He received his doctoral degree from Tokyo Institute of Technology in 2004. His research interests include conic optimization, its applications, and numerical methods.
