High-order tensor methods for solving both convex and nonconvex optimization problems have recently generated significant research interest, due in part to the natural way in which higher derivatives can be incorporated into adaptive regularization frameworks, leading to algorithms with optimal global rates of convergence and local rates that are faster than Newton's method. These algorithms crucially rely on repeatedly minimizing nonconvex multivariate Taylor-based polynomial sub-problems, at least locally. Finding efficient techniques for the solution of these sub-problems, beyond the second-order case, has been an open question.
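As a point of reference, the sub-problem in question can be written in the standard adaptive regularization form from the literature (stated here under generic notation, not as the talk's specific setup): minimize the pth-order Taylor model of the objective f around the current iterate x_k, regularized by a power of the step norm,

    % pth-order adaptive regularization (ARp) sub-problem in the step s;
    % sigma_k > 0 is the adaptive regularization weight.
    \min_{s \in \mathbb{R}^n} \; m_p(s)
        := f(x_k) + \sum_{j=1}^{p} \frac{1}{j!} \nabla^j f(x_k)[s]^j
           + \frac{\sigma_k}{p+1} \|s\|^{p+1}

For p = 2 this reduces to the well-studied cubic regularization (ARC) sub-problem; for p >= 3 the model m_p is a nonconvex multivariate polynomial in s, which is the source of the difficulty.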
In this talk, we discuss various algorithmic frameworks that we designed for efficiently minimizing nonconvex Taylor-based regularized polynomial sub-problems beyond the second order. These include the Quadratic Quartic Regularization (QQR) method, which approximates the third-order tensor term using a linear combination of quadratic and quartic terms. Additionally, we discuss the Cubic Quartic Regularization (CQR) method, which incorporates both cubic and quartic terms. We also explore the complexity of the Sums-of-Squares (SoS) Taylor method, which uses a convex SoS Taylor model that can be tractably minimized. We test these algorithms numerically and observe that some variants are competitive with ARC and other sub-problem solvers on typical instances, and even superior on ill-conditioned sub-problems with special structure.
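For the third-order case (p = 3), the surrogate models above can be sketched schematically; the coefficients alpha, beta, gamma >= 0 below are illustrative placeholders that the actual methods choose adaptively, so this is an indicative sketch rather than the methods' precise definitions:

    % AR3 sub-problem: the third-order tensor term makes m_3 a nonconvex,
    % quartically regularized polynomial in s.
    m_3(s) := f(x_k) + \nabla f(x_k)[s] + \tfrac{1}{2} \nabla^2 f(x_k)[s]^2
              + \tfrac{1}{6} \nabla^3 f(x_k)[s]^3 + \tfrac{\sigma_k}{4} \|s\|^4
    % QQR-type surrogate: the tensor term is replaced by quadratic and
    % quartic terms in \|s\| (illustrative coefficients alpha, beta).
    m_{QQR}(s) := f(x_k) + \nabla f(x_k)[s] + \tfrac{1}{2} \nabla^2 f(x_k)[s]^2
              + \alpha \|s\|^2 + \beta \|s\|^4
    % CQR-type surrogate: a cubic term in \|s\| is retained as well
    % (illustrative coefficient gamma).
    m_{CQR}(s) := f(x_k) + \nabla f(x_k)[s] + \tfrac{1}{2} \nabla^2 f(x_k)[s]^2
              + \gamma \|s\|^3 + \beta \|s\|^4

Each surrogate is minimized in place of m_3, with the coefficients updated between inner minimizations.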