Title: An Asymptotic Analysis of Random Partition Based Minibatch Momentum Methods for Linear Regression Models
Authors: Gao, Yuan; Zhu, Xuening; Qi, Haobo; Li, Guodong; Zhang, Riquan; Wang, Hansheng
Affiliations: East China Normal Univ, Sch Stat, Shanghai, Peoples R China
East China Normal Univ, KLATASDS MOE, Shanghai, Peoples R China
Fudan Univ, Sch Data Sci, Shanghai, Peoples R China
Peking Univ, Guanghua Sch Management, Beijing, Peoples R China
Univ Hong Kong, Dept Stat & Actuarial Sci, Hong Kong, Peoples R China
Shanghai Univ Int Business & Econ, Sch Stat & Informat, Shanghai, Peoples R China
Issue Date: Dec-2022
Abstract: Momentum methods have been shown to accelerate the convergence of the standard gradient descent algorithm in practice and theory. In particular, the random partition based minibatch gradient descent methods with momentum (MGDM) are widely used to solve large-scale optimization problems with massive datasets. Despite the great popularity of the MGDM methods in practice, their theoretical properties are still underexplored. To this end, we investigate the theoretical properties of MGDM methods based on the linear regression models. We first study the numerical convergence properties of the MGDM algorithm and derive the conditions for a faster numerical convergence rate. In addition, we explore the relationship between the statistical properties of the resulting MGDM estimator and the tuning parameters. Based on these theoretical findings, we give the conditions for the resulting estimator to achieve the optimal statistical efficiency. Finally, extensive numerical experiments are conducted to verify our theoretical results. Supplementary materials for this article are available online.
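The abstract describes random partition based minibatch gradient descent with momentum (MGDM) applied to linear regression: in each epoch the sample is randomly partitioned into minibatches, and the coefficient vector is updated with a momentum-accelerated least-squares gradient step. A minimal sketch of this general scheme is given below; the function name, tuning-parameter values, and variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mgdm_linear_regression(X, y, n_epochs=50, batch_size=32,
                           lr=0.1, momentum=0.9, seed=0):
    """Hypothetical sketch of random-partition minibatch GD with momentum
    (MGDM) for linear regression; not the paper's exact algorithm."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    theta = np.zeros(p)   # regression coefficient estimate
    v = np.zeros(p)       # momentum (velocity) term
    for _ in range(n_epochs):
        # Random partition: shuffle all indices, then split into minibatches.
        idx = rng.permutation(n)
        for batch in np.array_split(idx, max(1, n // batch_size)):
            Xb, yb = X[batch], y[batch]
            # Least-squares gradient on the current minibatch.
            grad = Xb.T @ (Xb @ theta - yb) / len(batch)
            v = momentum * v - lr * grad   # momentum update
            theta = theta + v
    return theta
```

The learning rate and momentum weight here are the tuning parameters whose interplay with convergence speed and statistical efficiency the paper analyzes.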
Appears in Collections: Guanghua School of Management

Files in This Work
There are no files associated with this item.


License: See PKU IR operational policies.