Volume 43, Issue 5
Mini-Batch Stochastic Conjugate Gradient Algorithms with Minimal Variance

Caixia Kou, Feifei Gao & Yu-Hong Dai

J. Comp. Math., 43 (2025), pp. 1045-1062.

Published online: 2025-09

  • Abstract

Stochastic gradient descent (SGD) methods have gained widespread popularity for solving large-scale optimization problems. However, the inherent variance in SGD often leads to slow convergence rates. We introduce a family of unbiased stochastic gradient estimators that encompasses existing estimators from the literature, and we identify a gradient estimator that not only maintains unbiasedness but also achieves minimal variance. Compared with the standard estimator used in SGD algorithms, the proposed estimator demonstrates a significant reduction in variance. By using this stochastic gradient estimator to approximate the full gradient, we propose two mini-batch stochastic conjugate gradient algorithms with minimal variance. Under strong convexity and smoothness assumptions on the objective function, we prove that both algorithms achieve linear convergence rates. Numerical experiments validate the effectiveness of the proposed gradient estimator in reducing variance and demonstrate that the two stochastic conjugate gradient algorithms exhibit accelerated convergence and enhanced stability.
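
For orientation, here is a minimal sketch of the standard finite-sum setting in which such estimators arise, together with one illustrative unbiased control-variate family; this is an assumption-laden illustration, not the paper's specific construction, whose estimator family and minimal-variance member are defined in the full text:

    \min_{x \in \mathbb{R}^d} \; f(x) = \frac{1}{n} \sum_{i=1}^{n} f_i(x), \qquad
    g_k = \frac{1}{b} \sum_{i \in S_k} \nabla f_i(x_k), \qquad
    \mathbb{E}[g_k] = \nabla f(x_k),

where S_k is a mini-batch of size b drawn uniformly at random, so the plain mini-batch estimator g_k is unbiased. For any reference point \tilde{x} (a hypothetical choice here, e.g. a past iterate) and weight \theta, the control-variate estimator

    g_k(\theta) = \frac{1}{b} \sum_{i \in S_k} \left( \nabla f_i(x_k) - \theta \, \nabla f_i(\tilde{x}) \right) + \theta \, \nabla f(\tilde{x})

remains unbiased, because the added correction term has zero mean, while \theta can be tuned to minimize \mathrm{Var}[g_k(\theta)]. A parameterized family of this kind is one way an unbiased estimator with minimal variance can be singled out.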

  • AMS Subject Headings

49M37, 90C25

  • Copyright

© Global Science Press

  • BibTex

@Article{JCM-43-1045,
  author  = {Kou, Caixia and Gao, Feifei and Dai, Yu-Hong},
  title   = {Mini-Batch Stochastic Conjugate Gradient Algorithms with Minimal Variance},
  journal = {Journal of Computational Mathematics},
  year    = {2025},
  volume  = {43},
  number  = {5},
  pages   = {1045--1062},
  issn    = {1991-7139},
  doi     = {https://doi.org/10.4208/jcm.2505-m2025-0004},
  url     = {http://global-sci.org/intro/article_detail/jcm/24470.html}
}
  • RIS

TY - JOUR
T1 - Mini-Batch Stochastic Conjugate Gradient Algorithms with Minimal Variance
AU - Kou, Caixia
AU - Gao, Feifei
AU - Dai, Yu-Hong
JO - Journal of Computational Mathematics
VL - 43
IS - 5
SP - 1045
EP - 1062
PY - 2025
DA - 2025/09
SN - 1991-7139
DO - https://doi.org/10.4208/jcm.2505-m2025-0004
UR - https://global-sci.org/intro/article_detail/jcm/24470.html
KW - Stochastic gradient descent
KW - Minimal variance
KW - Stochastic conjugate gradient
KW - Stochastic gradient estimator
ER -

  • TXT

Kou, Caixia, Gao, Feifei and Dai, Yu-Hong. (2025). Mini-Batch Stochastic Conjugate Gradient Algorithms with Minimal Variance. Journal of Computational Mathematics. 43 (5). 1045-1062. doi:10.4208/jcm.2505-m2025-0004