In this work, we consider the low-rank decomposition (SDPR) of general convex semidefinite programming problems (SDP) that contain both a positive semidefinite matrix and a nonnegative vector as variables. We develop a rank-support-adaptive feasible method to solve (SDPR). The method is able to escape from saddle points, which ensures convergence to a global optimal solution for generic constraint vectors. We prove global convergence and local linear convergence without assuming that the objective function is twice differentiable. The iteration complexity of our algorithm improves on previous results in the literature. To overcome the degeneracy issues of SDP problems, we develop two strategies based on random perturbation and dual refinement. These techniques enable us to solve some primal degenerate SDP problems efficiently, for example, Lovász theta SDPs. Our work is a step forward in extending the application range of Riemannian optimization approaches for solving SDP problems. Numerical experiments are conducted to verify the efficiency and robustness of our method.
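For concreteness, the following is a minimal sketch of the kind of factorized formulation described above, assuming a standard Burer-Monteiro-type decomposition of the semidefinite variable and an entrywise-square parametrization of the nonnegative vector; the precise problem class and notation in the talk may differ:

(SDP)   \min_{X,\,y} \; f(X, y) \quad \text{s.t.} \quad \mathcal{A}(X) + \mathcal{B}(y) = b, \;\; X \succeq 0, \;\; y \geq 0,

(SDPR)  \min_{R,\,u} \; f(RR^{\top}, u \circ u) \quad \text{s.t.} \quad \mathcal{A}(RR^{\top}) + \mathcal{B}(u \circ u) = b, \;\; R \in \mathbb{R}^{n \times r}, \;\; u \in \mathbb{R}^{m},

where r is the (adaptively updated) rank and u \circ u denotes the entrywise square. With this parametrization, X = RR^{\top} \succeq 0 and y = u \circ u \geq 0 hold automatically, so iterates remain feasible with respect to the conic constraints, which is what makes feasible (Riemannian-type) methods applicable.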
Speaker biography: Toh Kim Chuan is a Professor at the National University of Singapore and a Fellow of the Singapore National Academy of Science. He received his bachelor's degree from the National University of Singapore in 1990, master's degrees from the National University of Singapore and Cornell University in 1992 and 1994 respectively, and his Ph.D. in Applied Mathematics from Cornell University in 1996, under the supervision of the renowned numerical analyst Lloyd N. Trefethen. He has been on the faculty of the Department of Mathematics at the National University of Singapore since 1996. Professor Toh is an internationally recognized expert in numerical optimization, working mainly on the design, analysis, and implementation of algorithms for matrix optimization, second-order cone programming, and convex programming. Software developed by Professor Toh and his collaborators is widely used in academia and industry; for example, SDPT3 and SDPNAL, free solvers for semidefinite programming, second-order cone programming, and linear programming, are in widespread use. Professor Toh received the Outstanding Scientist Award of the National University of Singapore in 2007, delivered a plenary lecture at the 2010 SIAM Annual Meeting, and has served many times on the organizing committees of major international conferences. He was awarded the 2017 Farkas Prize and was elected a SIAM Fellow in 2018. He currently serves as an Associate Editor of the leading optimization journals SIAM Journal on Optimization and Mathematical Programming Series B, as an Area Editor of Mathematical Programming Computation, and on the editorial boards of several other international journals.