
Academic Activities
SGD-type Methods with Guaranteed Global Stability in Nonsmooth Nonconvex Optimization
Speaker:
Dr. Nachuan Xiao, National University of Singapore
Inviter:
Prof. Yaxiang Yuan
Title:
SGD-type Methods with Guaranteed Global Stability in Nonsmooth Nonconvex Optimization
Time & Venue:
15:30–16:30, Tuesday, July 9, Z311
Abstract:

In this talk, we focus on providing convergence guarantees for variants of the stochastic subgradient descent (SGD) method in minimizing nonsmooth nonconvex functions. We first develop a general framework for establishing the global stability of general stochastic subgradient methods whose corresponding differential inclusion admits a coercive Lyapunov function. We prove that, with sufficiently small stepsizes and controlled noise, the iterates asymptotically stabilize around the stable set of the corresponding differential inclusion. We then introduce a scheme for developing SGD-type methods with regularized update directions for the primal variables. Based on this framework, we prove the global stability of the proposed scheme under mild conditions. We further show that the scheme yields variants of SGD-type methods with guaranteed convergence in training nonsmooth neural networks. In particular, by employing the sign map to regularize the update directions, we propose a novel subgradient method named the Sign-map Regularized SGD method (SRSGD). Preliminary numerical experiments demonstrate the high efficiency of SRSGD in training deep neural networks.
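
The abstract does not spell out the SRSGD update rule. As a minimal sketch, assuming the sign map is applied elementwise to the stochastic subgradient estimate (in the spirit of signSGD-style methods) and that diminishing stepsizes play the role of the "sufficiently small stepsizes" above, an illustrative update might look like the following; the function names and the toy objective are hypothetical, not taken from the talk:

    import numpy as np

    def srsgd_sketch(subgrad_fn, x0, stepsizes, rng=None):
        """Hypothetical sign-map regularized SGD loop (illustration only).

        subgrad_fn(x, rng) returns a noisy subgradient estimate at x.
        Each step uses x_{k+1} = x_k - eta_k * sign(g_k), so the sign map
        bounds every coordinate of the update direction by 1.
        """
        rng = rng or np.random.default_rng(0)
        x = np.asarray(x0, dtype=float).copy()
        for eta in stepsizes:
            g = subgrad_fn(x, rng)    # stochastic subgradient estimate
            x -= eta * np.sign(g)     # sign map regularizes the direction
        return x

    # Toy usage on a nonsmooth nonconvex function f(x) = |x0| + 0.5*sin(x1),
    # with additive noise standing in for the stochastic oracle.
    def noisy_subgrad(x, rng):
        g = np.array([np.sign(x[0]), 0.5 * np.cos(x[1])])
        return g + 0.01 * rng.standard_normal(x.shape)

    steps = [0.1 / (k + 1) for k in range(200)]   # diminishing stepsizes
    print(srsgd_sketch(noisy_subgrad, [2.0, 1.0], steps))

In this reading, the sign map is one instance of the regularized update directions described above: it keeps each step uniformly bounded regardless of the magnitude of the noisy subgradient, which is consistent with the stability argument requiring small stepsizes and controlled noise.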