Abstract
In this paper, we develop an exact reformulation and a deterministic approximation for distributionally robust joint chance-constrained programs (DRCCPs) with a general class of convex uncertain constraints under data-driven Wasserstein ambiguity sets. It is known that robust chance constraints can be conservatively approximated by worst-case conditional value-at-risk (CVaR) constraints. We show that the proposed worst-case CVaR approximation model for the joint DRCCP can be reformulated as an optimization problem involving biconvex constraints, and that this approximation is exact under certain conditions. We then derive a convex relaxation of the approximation model by introducing new decision variables that eliminate the biconvex terms. In particular, when the constraint function is affine in both the decision variable and the uncertainty, the resulting approximation model is equivalent to a tractable mixed-integer convex reformulation of the joint binary DRCCP. Numerical results illustrate the computational effectiveness and advantages of the proposed formulations.
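To make the CVaR approximation mentioned above concrete, the following is a standard sketch (not the paper's exact formulation; the symbols $g$, $x$, $\xi$, $\varepsilon$, and the ambiguity set $\mathcal{F}$ are generic placeholders): a distributionally robust chance constraint is conservatively approximated by its worst-case CVaR counterpart,

```latex
% Distributionally robust (joint) chance constraint at risk level \varepsilon:
%   inf_{P \in \mathcal{F}} P( g(x,\xi) \le 0 ) \ge 1 - \varepsilon
% Conservative worst-case CVaR approximation:
\inf_{\mathbb{P}\in\mathcal{F}} \mathbb{P}\bigl(g(x,\xi)\le 0\bigr)\ \ge\ 1-\varepsilon
\quad\Longleftarrow\quad
\sup_{\mathbb{P}\in\mathcal{F}} \mathrm{CVaR}_{1-\varepsilon}^{\mathbb{P}}\bigl(g(x,\xi)\bigr)\ \le\ 0,
% where, via the Rockafellar--Uryasev representation,
\mathrm{CVaR}_{1-\varepsilon}^{\mathbb{P}}\bigl(g(x,\xi)\bigr)
= \inf_{\beta\in\mathbb{R}}\Bigl\{\beta + \tfrac{1}{\varepsilon}\,
\mathbb{E}_{\mathbb{P}}\bigl[(g(x,\xi)-\beta)^{+}\bigr]\Bigr\}.
```

The implication holds because CVaR dominates the value-at-risk at the same level; the paper's contribution concerns when this conservative approximation becomes exact under Wasserstein ambiguity sets and how the resulting biconvex model can be relaxed.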
Disclosure statement
No potential conflict of interest was reported by the author(s).
Data availability statement
Some or all data, models, or code generated or used during the study are available from the first author upon request.
Additional information
Funding
Notes on contributors
Yining Gu
Yining Gu received the M.S. degree from the School of Mathematics, Tianjin University, Tianjin, China, in 2018. She is currently pursuing the Ph.D. degree with the School of Mathematics, Shanghai University of Finance and Economics, Shanghai, China. Her research interests include stochastic optimization, distributionally robust optimization, and their applications.
Yanjun Wang
Yanjun Wang received the Ph.D. degree from the School of Science, Xi'an Jiaotong University, Xi'an, China, in 2004. She is currently a Professor with the School of Mathematics, Shanghai University of Finance and Economics, Shanghai, China. Her research interests include stochastic optimization, distributionally robust optimization, non-convex optimization, global optimization, and their applications.