Convex-concave minimax optimization problems are optimization problems whose objective function and constraints are convex in one variable and concave in the other. Firstly, we discuss optimality conditions and develop a proximal gradient multi-step ascent descent method (PGmsAD) as a practical numerical algorithm for nonsmooth convex-concave minimax problems with joint linear constraints. We apply PGmsAD to generalized absolute value equations, generalized linear projection equations and linear regression problems, and report its efficiency on large-scale instances. Secondly, we present a stochastic approximation proximal subgradient (SAPS) method for stochastic convex-concave minimax optimization. To extend the approach to stochastic convex-concave minimax problems arising from stochastic convex conic optimization, we propose a linearized stochastic approximation augmented Lagrangian (LSAAL) method. Preliminary numerical results demonstrate the effectiveness of the SAPS and LSAAL methods.
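For reference, a schematic form of the problem class considered here can be written as follows (this is an illustrative formulation under standard assumptions; the precise constraint structure treated in the talk may differ):

\[
\min_{x \in X} \; \max_{y \in Y} \; f(x) + \Phi(x,y) - g(y)
\quad \text{s.t.} \quad Ax + By \le c,
\]

where $\Phi$ is convex in $x$ and concave in $y$, $f$ and $g$ are convex and possibly nonsmooth, $X$ and $Y$ are closed convex sets, and $Ax + By \le c$ represents the joint linear constraints coupling the two variables.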
Short Bio: 王嘉妮's research focuses on stochastic algorithms for nonlinear optimization problems and on optimality theory and algorithms for games with constraint structure. She has published several papers in leading operations research journals such as JCM and JSC, and has led a project funded by the 军科创新院 (Academy of Military Sciences innovation institute).