In this talk, we introduce two randomized first-order methods for convex finite-sum optimization. The first is the random primal-dual gradient (RPDG) method, which incorporates a random block decomposition into the primal-dual method for deterministic convex optimization. The second is the random gradient extrapolation method (RGEM), which incorporates randomized gradient evaluations into the gradient extrapolation method for deterministic convex optimization. We show that, under a judicious selection of algorithmic parameters, both methods attain the optimal iteration complexity bound for randomized methods, while requiring fewer gradient evaluations than optimal deterministic first-order methods in certain regimes.
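To make the shared mechanism concrete, the sketch below illustrates the key idea behind randomized gradient evaluation for a finite-sum objective min_x (1/m) Σ_i f_i(x): each iteration samples a single component i and evaluates only ∇f_i, instead of all m component gradients. This is not the RPDG or RGEM update itself (those involve primal-dual and gradient-extrapolation steps not reproduced here); it uses a SAGA-style stored-gradient estimator as an illustrative stand-in, and the problem instance, step size, and helper names are assumptions for the example.

```python
import numpy as np

# Illustrative sketch: randomized gradient evaluation for finite-sum
# minimization of (1/m) * sum_i f_i(x). NOT the RPDG/RGEM updates;
# a SAGA-style gradient table is used as a simple stand-in estimator.

rng = np.random.default_rng(0)
m, d = 50, 10                     # assumed problem size
A = rng.standard_normal((m, d))   # synthetic least-squares data
b = rng.standard_normal(m)

def grad_fi(x, i):
    # Gradient of the i-th component f_i(x) = 0.5 * (a_i^T x - b_i)^2.
    return (A[i] @ x - b[i]) * A[i]

x = np.zeros(d)
table = np.array([grad_fi(x, i) for i in range(m)])  # stored component gradients
avg = table.mean(axis=0)          # running average of the table
eta = 0.01                        # step size (assumed, not tuned)

for t in range(5000):
    i = rng.integers(m)           # one random component per iteration:
    g_new = grad_fi(x, i)         # only a single gradient evaluation
    g_est = g_new - table[i] + avg  # variance-reduced gradient estimate
    avg += (g_new - table[i]) / m   # keep the running average current
    table[i] = g_new
    x -= eta * g_est              # gradient step using the estimate

print("final objective:", 0.5 * np.mean((A @ x - b) ** 2))
```

Because each iteration touches only one component, 5000 iterations here cost 100 passes' worth of component gradients; a full-gradient method would spend m evaluations per iteration, which is the source of the savings claimed above.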