
SGD-type Methods with Guaranteed Global Stability in Nonsmooth Nonconvex Optimization
Speaker:
Dr. Nachuan Xiao, National University of Singapore
Host:
Prof. Yaxiang Yuan
Subject:
SGD-type Methods with Guaranteed Global Stability in Nonsmooth Nonconvex Optimization
Time and place:
15:30-16:30, July 9 (Tuesday), Z311
Abstract:

In this talk, we focus on convergence guarantees for variants of the stochastic subgradient descent (SGD) method for minimizing nonsmooth nonconvex functions. We first develop a general framework that establishes global stability for stochastic subgradient methods whose corresponding differential inclusion admits a coercive Lyapunov function. We prove that, with sufficiently small stepsizes and controlled noise, the iterates asymptotically stabilize around the stable set of the corresponding differential inclusion. We then introduce a scheme for developing SGD-type methods with regularized update directions for the primal variables, and, based on our framework, we prove the global stability of the proposed scheme under mild conditions. We further show that our scheme yields variants of SGD-type methods that enjoy guaranteed convergence when training nonsmooth neural networks. In particular, by employing the sign map to regularize the update directions, we propose a novel subgradient method named the Sign-map Regularized SGD method (SRSGD). Preliminary numerical experiments demonstrate the high efficiency of SRSGD in training deep neural networks.
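For illustration, below is a minimal Python sketch of one plausible reading of an SRSGD-style iteration, assuming the update direction adds a sign-map term on the iterate to the stochastic subgradient, x_{k+1} = x_k - eta_k (g_k + lam * sign(x_k)), with a small diminishing stepsize. The update form, the stepsize schedule, the parameter lam, and the toy objective noisy_subgrad are all illustrative assumptions made here, not details taken from the talk.

import numpy as np

def srsgd(subgrad_fn, x0, steps=1000, eta0=0.1, lam=1e-3, seed=0):
    """Hypothetical SRSGD sketch: subgradient step with a sign-map
    regularized direction and diminishing stepsizes eta_k = eta0/sqrt(k+1)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for k in range(steps):
        g = subgrad_fn(x, rng)            # stochastic subgradient estimate
        d = g + lam * np.sign(x)          # sign-map regularized direction (assumed form)
        eta = eta0 / np.sqrt(k + 1)       # small, diminishing stepsizes
        x -= eta * d
    return x

def noisy_subgrad(x, rng):
    """Toy nonsmooth nonconvex objective (illustrative only):
    f(x) = ||x||_1 - 0.5 * ||x||^2 / (1 + ||x||^2).
    Returns a subgradient plus Gaussian noise, mimicking a stochastic oracle."""
    n2 = x @ x
    smooth_part = -x / (1.0 + n2) ** 2    # gradient of the smooth term
    return np.sign(x) + smooth_part + 0.01 * rng.standard_normal(x.shape)

if __name__ == "__main__":
    x_final = srsgd(noisy_subgrad, x0=np.array([2.0, -1.5, 0.5]))
    print("final iterate:", x_final)

Under this reading, the sign-map term acts as an l1-type pull toward the origin that keeps the iterates bounded, which is one natural way a coercive Lyapunov function for the corresponding differential inclusion can arise; the precise construction and convergence conditions in the talk may differ.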