In recent years, the proximal gradient method and its variants have been generalized to Riemannian manifolds for solving optimization problems of the form $f + g$, where $f$ is continuously differentiable and $g$ may be nonsmooth. In this talk, we discuss Riemannian proximal gradient methods and their variants, including the Riemannian proximal gradient method, an inexact Riemannian proximal gradient method, and a Riemannian proximal Newton method. Global convergence results and local convergence rates are given for these methods. Numerical experiments are used to demonstrate the performance of the proposed methods.
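For orientation, one common formulation of a single iteration of a Riemannian proximal gradient method is sketched below; the notation (a retraction $R$, a Lipschitz-type constant $L$, the tangent space $T_{x_k}\mathcal{M}$, and the Riemannian gradient $\mathrm{grad}\, f$) is standard but not fixed in this abstract, and the exact subproblem differs across the variants discussed.
\[
\eta_k \in \operatorname*{arg\,min}_{\eta \in T_{x_k}\mathcal{M}} \; \langle \mathrm{grad}\, f(x_k), \eta \rangle_{x_k} + \frac{L}{2}\,\|\eta\|_{x_k}^2 + g\big(R_{x_k}(\eta)\big), \qquad x_{k+1} = R_{x_k}(\eta_k).
\]
The subproblem linearizes $f$ on the tangent space while keeping $g$ intact, and the retraction $R_{x_k}$ maps the resulting tangent step back to the manifold; inexact variants solve this subproblem only approximately.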