Abstract
Algorithms for solving certain nonconvex, nonsmooth, finite-sum optimization problems are proposed and tested in the context of a transductive support vector machine (TSVM) problem formulation arising in semi-supervised machine learning. All of the algorithms employ an incremental quasi-Newton (IQN) strategy, specifically an incremental BFGS (IBFGS) strategy. One applies the strategy to the problem directly, whereas the others apply it to a difference-of-convex reformulation, a smoothed approximation, or a (strongly) convex local approximation. Experiments show that all of the IBFGS approaches perform well in practice, and all outperform a state-of-the-art bundle method when solving TSVM problem instances.
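To make the incremental BFGS idea concrete, the following is a minimal sketch, not the authors' exact algorithms: it cycles through the components of a finite sum, updating only one component's inverse-Hessian approximation per iteration via the standard BFGS secant update. The function name, the cyclic component selection, and the fixed step size `alpha` are illustrative assumptions.

```python
import numpy as np

def incremental_bfgs(grad, x0, n_components, steps=100, alpha=0.5):
    """Sketch of an incremental BFGS loop for f(x) = (1/n) * sum_i f_i(x).

    grad(i, x) returns the gradient of component f_i at x (an assumption
    about the problem interface, for illustration only).
    """
    d = x0.size
    x = x0.copy()
    H = [np.eye(d) for _ in range(n_components)]        # per-component inverse-Hessian approx.
    g_prev = [grad(i, x) for i in range(n_components)]  # last gradient seen for each component
    x_prev = [x.copy() for _ in range(n_components)]    # last iterate seen for each component
    for t in range(steps):
        i = t % n_components                            # cyclic component selection
        # Step along the averaged matrix/gradient direction.
        g = np.mean([grad(j, x) for j in range(n_components)], axis=0)
        H_bar = np.mean(H, axis=0)
        x_new = x - alpha * H_bar @ g
        # BFGS secant update for component i only.
        s = x_new - x_prev[i]
        y = grad(i, x_new) - g_prev[i]
        sy = s @ y
        if sy > 1e-10:                                  # curvature safeguard
            rho = 1.0 / sy
            V = np.eye(d) - rho * np.outer(s, y)
            H[i] = V @ H[i] @ V.T + rho * np.outer(s, s)
        x_prev[i], g_prev[i] = x_new.copy(), grad(i, x_new)
        x = x_new
    return x

# Toy usage: f_i(x) = 0.5 * ||x - c_i||^2, whose sum is minimized at the mean of the c_i.
c = np.array([[1.0, 0.0], [3.0, 2.0]])
x_star = incremental_bfgs(lambda i, x: x - c[i], np.zeros(2), n_components=2, steps=200)
```

In practice the per-iteration cost hinges on avoiding the full gradient average; IQN-type methods maintain running aggregates so that only component i's information is refreshed each iteration, which this sketch omits for clarity.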
Disclosure statement
No potential conflict of interest was reported by the author(s).
Additional information
Notes on contributors
Gulcin Dinc Yalcin
Gulcin Dinc Yalcin, Assistant Professor in the Department of Industrial Engineering at Eskisehir Technical University, conducts research on the design and analysis of algorithms and on diverse applications of optimization, from logistics to machine learning. She previously worked as a Postdoctoral Researcher in the Department of Industrial and Systems Engineering at Lehigh University.
Frank E. Curtis
Frank E. Curtis, Professor in the Department of Industrial and Systems Engineering at Lehigh University, conducts research on the design and analysis of algorithms for solving continuous optimization problems. He is a recipient of the Lagrange Prize in Continuous Optimization, awarded jointly by SIAM and the Mathematical Optimization Society, and of the INFORMS Computing Society Prize.