Volume 43, Issue 5
An Accelerated Stochastic Trust Region Method for Stochastic Optimization

Rulei Qi, Dan Xue, Jing Li & Yujia Zhai

J. Comp. Math., 43 (2025), pp. 1169-1193.

Published online: 2025-09

  • Abstract

In this paper, we propose an accelerated stochastic variance-reduced gradient method with a trust-region-like framework, referred to as the NMSVRG-TR method. Building on NMSVRG, we incorporate a Katyusha-like acceleration step into the stochastic trust region scheme, which improves the convergence rate of SVRG-type methods. Under appropriate assumptions, we establish linear convergence of the algorithm for strongly convex objective functions. Numerical results show that our algorithm is generally superior to several existing stochastic gradient methods.
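To make the ingredients named in the abstract concrete, the sketch below shows a generic SVRG-style loop with a Katyusha-like coupling step on a ridge-regularized least-squares problem. It is only a minimal illustration of variance reduction combined with momentum coupling, not the authors' NMSVRG-TR algorithm (in particular it has no trust-region radius update or acceptance test); the function name, step size eta, and coupling weight theta are illustrative assumptions.

import numpy as np

def svrg_katyusha_sketch(A, b, lam=0.01, eta=0.01, theta=0.3, n_epochs=20, seed=0):
    """Generic SVRG loop with a Katyusha-like coupling step (illustrative only)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    z = np.zeros(d)        # working iterate
    w_snap = np.zeros(d)   # snapshot (anchor) point for variance reduction

    def grad_i(x, i):      # gradient of the i-th component plus ridge term
        return (A[i] @ x - b[i]) * A[i] + lam * x

    def full_grad(x):      # full-batch gradient, recomputed once per epoch
        return A.T @ (A @ x - b) / n + lam * x

    for _ in range(n_epochs):
        mu = full_grad(w_snap)
        for _ in range(n):
            i = rng.integers(n)
            y = theta * z + (1 - theta) * w_snap          # Katyusha-like coupling
            g = grad_i(y, i) - grad_i(w_snap, i) + mu     # variance-reduced gradient
            z = z - eta * g                               # plain gradient step (no trust region)
        w_snap = z.copy()                                 # refresh the snapshot
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 10))
    x_true = rng.standard_normal(10)
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = svrg_katyusha_sketch(A, b)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

In the paper's setting, the inner update z - eta * g would instead be determined by a stochastic trust-region subproblem; the sketch isolates only the variance-reduction and acceleration ideas.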

  • AMS Subject Headings

65K05, 90C15

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTeX
  • RIS
  • TXT
@Article{JCM-43-1169,
  author  = {Qi, Rulei and Xue, Dan and Li, Jing and Zhai, Yujia},
  title   = {An Accelerated Stochastic Trust Region Method for Stochastic Optimization},
  journal = {Journal of Computational Mathematics},
  year    = {2025},
  volume  = {43},
  number  = {5},
  pages   = {1169--1193},
  issn    = {1991-7139},
  doi     = {https://doi.org/10.4208/jcm.2504-m2023-0228},
  url     = {http://global-sci.org/intro/article_detail/jcm/24476.html}
}
TY - JOUR
T1 - An Accelerated Stochastic Trust Region Method for Stochastic Optimization
AU - Qi, Rulei
AU - Xue, Dan
AU - Li, Jing
AU - Zhai, Yujia
JO - Journal of Computational Mathematics
VL - 43
IS - 5
SP - 1169
EP - 1193
PY - 2025
DA - 2025/09
SN - 1991-7139
DO - https://doi.org/10.4208/jcm.2504-m2023-0228
UR - https://global-sci.org/intro/article_detail/jcm/24476.html
KW - Stochastic optimization, Stochastic variance reduced gradient, Trust region, Gradient descent method, Machine learning
ER -

Qi, Rulei, Xue, Dan, Li, Jing and Zhai, Yujia. (2025). An Accelerated Stochastic Trust Region Method for Stochastic Optimization. Journal of Computational Mathematics. 43 (5). 1169-1193. doi:10.4208/jcm.2504-m2023-0228