Abstract
The optimum and at least one optimizing point for convex nonlinear programs can be approximated well by the solution to a linear program (a fact long used in branch and bound algorithms). In more general problems, we can identify subspaces of ‘non-convex variables’ such that, if these variables have sufficiently small ranges, the optimum and at least one optimizing point can be approximated well by the solution of a single linear program. If these subspaces are low-dimensional, this suggests subdividing the variables in the subspace a priori, then producing and solving a fixed, known number of linear programs to obtain an approximation to the solution. The total amount of computation is much more predictable than that required to complete a branch and bound algorithm, and the scheme is ‘embarrassingly parallel’, with little need for either communication or load balancing. We compare such a non-adaptive scheme experimentally to our GlobSol branch and bound implementation, on those problems from the COCONUT project Lib1 test set with non-convex subspaces of dimension four or less, and we discuss potential alterations to both the non-adaptive scheme and our branch and bound process that might change the scope of applicability.
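The non-adaptive scheme described above can be illustrated in miniature. The following sketch (not the paper's implementation; all names are hypothetical) subdivides a one-dimensional non-convex variable a priori into a fixed number of subintervals and solves one linear relaxation per subinterval. Here the toy objective is concave, so the chord over each subinterval is a valid linear underestimator and the "linear program" reduces to picking an endpoint; the per-box solves are independent, which is what makes the scheme embarrassingly parallel.

```python
# Minimal sketch of the non-adaptive subdivision scheme, assuming a
# 1-D concave objective so that the chord is a linear underestimator.
# All function names here are illustrative, not from the paper.

def f(x):
    return -x * x  # concave toy objective; global min on [-1, 2] is at x = 2


def solve_relaxation(a, b):
    """Minimize the chord (linear relaxation) of f over [a, b].
    A linear objective over an interval attains its minimum at an
    endpoint, where the chord coincides with f itself."""
    fa, fb = f(a), f(b)
    return (a, fa) if fa <= fb else (b, fb)


def nonadaptive_minimize(lo, hi, M):
    """Split [lo, hi] into M equal subintervals a priori and solve one
    relaxation per piece.  The pieces are independent, so this loop
    could be distributed with essentially no communication."""
    h = (hi - lo) / M
    results = [solve_relaxation(lo + i * h, lo + (i + 1) * h)
               for i in range(M)]
    return min(results, key=lambda r: r[1])


x_star, v_star = nonadaptive_minimize(-1.0, 2.0, M=8)
```

In a d-dimensional non-convex subspace the same idea produces M^d boxes, which is why the approach is attractive only when that subspace is low-dimensional, as the abstract notes.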
Acknowledgements
We thank the referees for their thorough and detailed comments and suggestions. We also thank the editor for his patience while we corrected bugs in the code and thoroughly analysed anomalies in the results.
Notes
1 This does not include statistical or heuristic methods, exemplified by evolutionary algorithms, simulated annealing, and the like.
2 An additional important technique is constraint propagation; feasibility analyses, analysis of Kuhn–Tucker criticality, and various methods based on interval analysis also play a role. However, these will not be the primary focus of this work.
3 Assembling the results requires communication, but this involves much less processing than each solution process. We elaborate later in this work.
4 For example, by forming linear relaxations, by using a solver for convex problems where appropriate, etc.
5 If using linear relaxations, this may be done with the simple and inexpensive technique in [18]. If using a solver for convex problems, this can possibly be done with an astutely implemented interval Newton method applied over a small box constructed about the approximate solution.
6 Memory was not an issue in GlobSol’s branch and bound algorithm, but was a limiting factor in preliminary experiments with Algorithm 1 when the problem had a larger reduced space dimension or a larger M was used.
7 For the convex problems, this ratio is 1, since only one box in total was considered.