Abstract
This paper presents an incremental cubic-regularized symmetric rank-1 (SR1) method (ICuREGSR1). By incorporating the cubic regularization technique into SR1, we address the indefinite Hessian approximation that the SR1 update can produce. Our core strategy is an incremental optimization scheme that gradually updates the information of the objective function, which typically takes the form of a sum of many independent component functions, a structure that is very common in large-scale machine learning tasks. Numerical experiments on several machine learning problems show that, compared with traditional algorithms, the proposed algorithm achieves superior performance in terms of gradient magnitude.

Index Terms—quasi-Newton method, symmetric rank-1, superlinear convergence rate, cubic regularization, incremental optimization.
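For context, the following is a minimal sketch of the standard ingredients the abstract refers to (the finite-sum objective, the classical SR1 update, and a cubic-regularized local model); the notation (f_i, B_k, sigma_k) and the particular regularization convention are generic assumptions, not taken from the paper body:

% Finite-sum objective common in large-scale machine learning
f(x) \;=\; \frac{1}{n}\sum_{i=1}^{n} f_i(x)

% Classical SR1 update of the Hessian approximation B_k,
% with s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k);
% the denominator may vanish or change sign, so B_{k+1} can be indefinite
B_{k+1} \;=\; B_k + \frac{(y_k - B_k s_k)(y_k - B_k s_k)^{\top}}{(y_k - B_k s_k)^{\top} s_k}

% Cubic-regularized local model minimized at each step
% (one common convention for the weight of the cubic term)
m_k(s) \;=\; f(x_k) + \nabla f(x_k)^{\top} s + \tfrac{1}{2}\, s^{\top} B_k s + \tfrac{\sigma_k}{3}\,\|s\|^{3}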
Incremental Cubic-Regularized SR1 Quasi-Newton Method and the Applications in Large-Scale Problems