Stochastic methods are widely used in deep learning. In this talk, we first review state-of-the-art methods such as KFAC. We then present a structured stochastic quasi-Newton method and a sketchy empirical natural gradient method. Numerical results on deep convolutional networks show that our methods are competitive with SGD and KFAC.
All are welcome to attend!