On the Asymptotic Convergence and Acceleration of Gradient Methods
Huang, Yakui (1); Dai, Yu-Hong (2,3); Liu, Xin-Wei (1); Zhang, Hongchao (4)
2022
Source Publication: JOURNAL OF SCIENTIFIC COMPUTING
ISSN: 0885-7474
Volume: 90  Issue: 1  Pages: 29
Abstract: We consider the asymptotic behavior of a family of gradient methods that includes the steepest descent and minimal gradient methods as special instances. It is proved that each method in the family will asymptotically zigzag between two directions. Asymptotic convergence results for the objective value, gradient norm, and stepsize are presented as well. To accelerate the family of gradient methods, we further exploit spectral properties of the stepsizes to break the zigzagging pattern. In particular, a new stepsize is derived by imposing finite termination on minimizing a two-dimensional strictly convex quadratic function. It is shown that, for a general quadratic function, the proposed stepsize asymptotically converges to the reciprocal of the largest eigenvalue of the Hessian. Furthermore, based on this spectral property, we propose a periodic gradient method that incorporates the Barzilai-Borwein method. Numerical comparisons with some recent successful gradient methods show that our new method is very promising.
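The following is a minimal, illustrative sketch (not the authors' implementation) of the gradient-method family the abstract describes, applied to a strictly convex quadratic f(x) = 0.5*x'Ax - b'x with gradient g(x) = Ax - b. It implements only the standard steepest descent (SD), minimal gradient (MG), and Barzilai-Borwein (BB1) stepsizes named in the abstract; the paper's new spectral stepsize and periodic scheme are not reproduced here, and all function and variable names are illustrative.

import numpy as np

def gradient_method(A, b, x0, rule="BB1", tol=1e-8, max_iter=10000):
    """Minimize f(x) = 0.5*x'Ax - b'x for symmetric positive definite A."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                                # gradient of f at x
    alpha = 1.0 / max(np.linalg.norm(g), 1e-16)  # conservative first stepsize
    k = 0
    while np.linalg.norm(g) > tol and k < max_iter:
        s = -alpha * g                           # s_k = x_{k+1} - x_k
        x = x + s
        g_new = A @ x - b
        y = g_new - g                            # y_k = g_{k+1} - g_k (= A s_k here)
        g = g_new
        if rule == "SD":                         # steepest descent (exact line search)
            alpha = (g @ g) / (g @ (A @ g))
        elif rule == "MG":                       # minimal gradient stepsize
            Ag = A @ g
            alpha = (g @ Ag) / (Ag @ Ag)
        else:                                    # BB1: s's / s'y
            alpha = (s @ s) / (s @ y)
        k += 1
    return x, k

# Usage on a random SPD system:
rng = np.random.default_rng(0)
M = rng.standard_normal((100, 100))
A = M.T @ M + np.eye(100)
b = rng.standard_normal(100)
for rule in ("SD", "MG", "BB1"):
    _, iters = gradient_method(A, b, np.zeros(100), rule)
    print(rule, iters)

On such problems the SD iterates tend to zigzag between two search directions, which is the asymptotic behavior the paper analyzes; this is why BB1 typically terminates in far fewer iterations than SD or MG.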
Keywords: Gradient methods; Asymptotic convergence; Spectral property; Acceleration of gradient methods; Barzilai-Borwein method; Unconstrained optimization; Quadratic optimization
DOI: 10.1007/s10915-021-01685-8
Indexed By: SCI
Language: English
Funding Project: National Natural Science Foundation of China [11701137, 11631013, 12071108, 11671116, 11991021, 12021001]; Strategic Priority Research Program of Chinese Academy of Sciences [XDA27000000]; Beijing Academy of Artificial Intelligence (BAAI); Natural Science Foundation of Hebei Province [A2021202010]; China Scholarship Council [201806705007]; USA National Science Foundation [DMS-1819161, DMS-2110722]
WOS Research Area: Mathematics
WOS Subject: Mathematics, Applied
WOS ID: WOS:000720653400002
Publisher: SPRINGER/PLENUM PUBLISHERS
Document Type: Journal article
Identifier: http://ir.amss.ac.cn/handle/2S8OKBNM/59572
Collection: Academy of Mathematics and Systems Science, Chinese Academy of Sciences
Corresponding Author: Zhang, Hongchao
Affiliation:
1. Hebei Univ Technol, Inst Math, Tianjin 300401, Peoples R China
2. Chinese Acad Sci, Acad Math & Syst Sci, ICMSEC, LSEC, Beijing 100190, Peoples R China
3. Univ Chinese Acad Sci, Sch Math Sci, Beijing 100049, Peoples R China
4. Louisiana State Univ, Dept Math, Baton Rouge, LA 70803 USA
Recommended Citation
GB/T 7714: Huang, Yakui, Dai, Yu-Hong, Liu, Xin-Wei, et al. On the Asymptotic Convergence and Acceleration of Gradient Methods[J]. JOURNAL OF SCIENTIFIC COMPUTING, 2022, 90(1): 29.
APA: Huang, Yakui, Dai, Yu-Hong, Liu, Xin-Wei, & Zhang, Hongchao. (2022). On the Asymptotic Convergence and Acceleration of Gradient Methods. JOURNAL OF SCIENTIFIC COMPUTING, 90(1), 29.
MLA: Huang, Yakui, et al. "On the Asymptotic Convergence and Acceleration of Gradient Methods." JOURNAL OF SCIENTIFIC COMPUTING 90.1 (2022): 29.
Files in This Item:
There are no files associated with this item.
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.