
Activities
Adaptive Methods for Nonsmooth Optimization with Convergence Guarantees
Speaker:
Dr. Nachuan Xiao, National University of Singapore
Invited by:
Academician Yaxiang Yuan
Subject:
Adaptive Methods for Nonsmooth Optimization with Convergence Guarantees
Time and place:
10:00–11:00, Thursday, April 13, Room Z311
Abstract:

In this talk, we present a comprehensive study of the convergence properties of adaptive methods for nonsmooth optimization, particularly in the context of training nonsmooth neural networks. We introduce a novel two-time-scale framework that assigns distinct stepsizes to the update directions and to the evaluation noises, and we prove its convergence properties. We show that the proposed framework encompasses various popular adaptive methods and hence provides convergence guarantees for these methods under mild assumptions when training nonsmooth neural networks. Moreover, we develop stochastic subgradient methods with gradient-clipping techniques for training nonsmooth neural networks under heavy-tailed noises. Through the proposed framework, we show that these methods converge when the evaluation noises are only assumed to be integrable. Extensive numerical experiments demonstrate the high efficiency and robustness of the proposed methods.
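The two-time-scale idea with gradient clipping can be sketched on a toy problem. The following is an illustrative example only, not the speaker's actual algorithm: the objective f(x) = |x - 1|, the stepsize schedules, the clipping threshold, and the Cauchy noise model are all assumptions chosen to show a faster stepsize tracking (averaging) the clipped noisy subgradients while a slower stepsize moves the iterate.

```python
import math
import random

def clip(g, tau):
    # Norm clipping (scalar case): rescale g if its magnitude exceeds tau.
    return g if abs(g) <= tau else tau * (g / abs(g))

def noisy_subgrad(x, target=1.0):
    # Subgradient of the nonsmooth toy objective f(x) = |x - target|,
    # corrupted by heavy-tailed (standard Cauchy) noise, whose mean does
    # not even exist -- a stand-in for the "integrable noise" regime.
    g = 0.0 if x == target else math.copysign(1.0, x - target)
    noise = math.tan(math.pi * (random.random() - 0.5))  # Cauchy sample
    return g + noise

def two_time_scale_clipped(x0, iters=20000, tau=2.0, seed=0):
    # Hypothetical two-time-scale clipped subgradient loop:
    #   fast scale (beta):  the direction d averages clipped subgradients;
    #   slow scale (alpha): the iterate x follows the averaged direction.
    random.seed(seed)
    x, d = x0, 0.0
    for k in range(1, iters + 1):
        alpha = 0.5 / k ** 0.75   # slower stepsize for the iterate
        beta = 1.0 / k ** 0.5     # faster stepsize for direction tracking
        g = clip(noisy_subgrad(x), tau)
        d = (1 - beta) * d + beta * g
        x = x - alpha * d
    return x

if __name__ == "__main__":
    # Starting far from the minimizer x* = 1, the iterate drifts toward it
    # despite the clipped heavy-tailed noise.
    print(two_time_scale_clipped(5.0))
```

Clipping is what makes the heavy-tailed noise tractable here: the raw Cauchy-corrupted subgradient has unbounded variance, but after clipping each sample lies in [-tau, tau], so the fast-scale average d stays well behaved.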