Exploiting aggregate sparsity in second-order cone relaxations for quadratic constrained quadratic programming problems

Pages 753-771 | Received 05 Nov 2019, Accepted 18 Sep 2020, Published online: 29 Sep 2020
 

Abstract

Among the many approaches to increasing the computational efficiency of semidefinite programming (SDP) relaxations for nonconvex quadratically constrained quadratic programming problems (QCQPs), two popular ones are exploiting the aggregate sparsity of the data matrices in the SDP, as in Fukuda et al. [Exploiting sparsity in semidefinite programming via matrix completion I: General framework, SIAM J. Optim. 11(3) (2001), pp. 647–674], and second-order cone programming (SOCP) relaxation. In this paper, we exploit the aggregate sparsity of the SOCP relaxation of nonconvex QCQPs. Specifically, we prove that exploiting the aggregate sparsity reduces the number of second-order cones in the SOCP relaxation, and that the matrix completion procedure of Fukuda et al. can be simplified in both the primal and the dual of the SOCP relaxation without losing the max-determinant property. For the numerical experiments, nonconvex QCQPs derived from lattice graphs and pooling problems are tested, as their SOCP relaxations attain the same optimal value as the SDP relaxations. We demonstrate that exploiting the aggregate sparsity improves the computational efficiency of the SOCP relaxation while preserving the objective value of the SDP relaxation, so the proposed SOCP relaxation can handle much larger problems than the SDP relaxation.


Acknowledgments

This research was conducted while the first author was a research student at the Department of Mathematical and Computing Science, Tokyo Institute of Technology.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This research was partially supported by Grant-in-Aid for Scientific Research (B) [grant number 18K11176].

Notes on contributors

Heejune Sheen

Heejune Sheen is a master's student at the H. Milton Stewart School of Industrial and Systems Engineering, Georgia Institute of Technology. He graduated from the Korea Advanced Institute of Science and Technology (KAIST) with a B.S. in Mathematics. His research interests include statistics, machine learning, conic optimization, and convex optimization.

Makoto Yamashita

Makoto Yamashita is a professor in the Department of Mathematical and Computing Science at Tokyo Institute of Technology. He received his doctoral degree from Tokyo Institute of Technology in 2004. His research interests include conic optimization, its applications, and numerical methods.
