This talk presents a solver-aware perspective on large-scale nonconvex optimization, coupling algorithmic design with matrix- and tensor-free subproblem solvers. First, we discuss second-order methods with full curvature awareness. By detecting and handling nonpositive curvature on the fly within the Krylov-subspace iterations used to solve the Newton system, we establish line-search methods with global complexity guarantees and superlinear local convergence under standard regularity conditions. Moving to third order and beyond (arbitrary order p), we present practical and theoretical results for adaptively regularized tensor methods (ARp). We introduce improved strategies, including efficient regularization updates and a novel pre-rejection mechanism. Furthermore, we extend the local convergence result of Doikov and Nesterov [Math. Program., 193 (2022), pp. 315-336] for tensor methods and establish a sharp local p-th order convergence rate for ARp, contingent on an appropriate choice of the local subproblem minimizer.
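To give a concrete picture of the first part, the sketch below shows a minimal matrix-free truncated Newton-CG loop in which a nonpositive-curvature test is embedded in the Krylov iteration. This is only an illustration of where the test enters, not the speaker's algorithm (which exploits the detected directions more carefully to obtain the stated complexity and superlinear guarantees); the callbacks `grad` and `hess_vec` are assumed to supply the gradient and Hessian-vector products.

```python
import numpy as np

def newton_cg_direction(grad, hess_vec, x, tol=1e-8, max_iter=200):
    """Matrix-free truncated CG for the Newton system H(x) s = -grad(x).

    Only gradient and Hessian-vector products are required. If a direction
    of nonpositive curvature is encountered, CG stops early and a descent
    direction is returned instead.
    """
    g = grad(x)
    s = np.zeros_like(g)
    r = g.copy()                  # residual of H s + g = 0
    d = -r                        # current CG search direction
    rs_old = float(r @ r)
    for _ in range(max_iter):
        Hd = hess_vec(x, d)
        curv = float(d @ Hd)
        if curv <= 0.0:
            # Nonpositive curvature detected: return the last CG iterate,
            # or the steepest-descent direction if no progress was made.
            return s if np.linalg.norm(s) > 0 else -g
        alpha = rs_old / curv
        s = s + alpha * d
        r = r + alpha * Hd
        rs_new = float(r @ r)
        if np.sqrt(rs_new) <= tol:
            break
        d = -r + (rs_new / rs_old) * d
        rs_old = rs_new
    return s
```

A line-search method would then step along the returned direction; the point of the talk's construction is that this negative-curvature information, obtained for free inside the Krylov solver, suffices for global complexity guarantees.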
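As background for the second part, the ARp subproblem at an iterate x_k minimizes a regularized p-th order Taylor model of the objective. One common form (the exact scaling of the regularization term varies across the literature) is

\[
  m_k(s) \;=\; f(x_k) \;+\; \sum_{i=1}^{p} \frac{1}{i!}\,\nabla^i f(x_k)[s]^i \;+\; \frac{\sigma_k}{(p+1)!}\,\|s\|^{p+1},
\]

where \(\sigma_k > 0\) is the adaptive regularization parameter adjusted from iteration to iteration. For p = 2 this reduces to cubic regularization of Newton's method; the regularization-update and pre-rejection strategies mentioned in the abstract concern how \(\sigma_k\) is adapted and when trial steps are rejected.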