CSpace
Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions
Zhang, Hui (1); Dai, Yu-Hong (2); Guo, Lei (3); Peng, Wei (1)
2021-02-01
Journal: MATHEMATICS OF OPERATIONS RESEARCH
ISSN: 0364-765X
Volume: 46, Issue: 1, Pages: 61-81
Abstract: We introduce a unified algorithmic framework, called the proximal-like incremental aggregated gradient (PLIAG) method, for minimizing the sum of a convex function that consists of additive relatively smooth convex components and a proper lower semicontinuous convex regularization function over an abstract feasible set whose geometry can be captured by using the domain of a Legendre function. The PLIAG method includes many existing algorithms in the literature as special cases, such as the proximal gradient method, the Bregman proximal gradient method (also called the NoLips algorithm), the incremental aggregated gradient method, the incremental aggregated proximal method, and the proximal incremental aggregated gradient method. It also includes some novel and interesting iteration schemes. First, we show that the PLIAG method is globally sublinearly convergent without requiring a growth condition, which extends the sublinear convergence result for the proximal gradient algorithm to incremental aggregated-type first-order methods. Then, by embedding a so-called Bregman distance growth condition into a descent-type lemma to construct a special Lyapunov function, we show that the PLIAG method is globally linearly convergent in terms of both function values and Bregman distances to the optimal solution set, provided that the step size is not greater than some positive constant. The convergence results derived in this paper are all established beyond the standard assumptions in the literature (i.e., without requiring strong convexity or Lipschitz gradient continuity of the smooth part of the objective). When specialized to many existing algorithms, our results recover or supplement their convergence results under strictly weaker conditions.
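Illustrative sketch: the record contains only the abstract, not the paper's iteration formulas, so the Python snippet below shows just the proximal incremental aggregated gradient special case named in the abstract, under illustrative assumptions (Euclidean Bregman kernel, least-squares components f_i, an l1 regularizer g, and a fixed step size); it is a minimal sketch, not the authors' PLIAG method in its full Bregman/Legendre generality.

    import numpy as np

    def soft_threshold(v, t):
        # Proximal map of t*||.||_1; an illustrative choice for the regularizer g.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def piag(A_blocks, b_blocks, x0, step, lam=0.1, n_epochs=50):
        # Minimize sum_i 0.5*||A_i x - b_i||^2 + lam*||x||_1 by refreshing one
        # component gradient per step and taking a proximal step along the
        # aggregate of the most recently stored (possibly outdated) gradients.
        m = len(A_blocks)
        x = x0.copy()
        grads = [A.T @ (A @ x - b) for A, b in zip(A_blocks, b_blocks)]
        agg = sum(grads)                      # aggregated gradient
        for _ in range(n_epochs):
            for i in range(m):
                g_new = A_blocks[i].T @ (A_blocks[i] @ x - b_blocks[i])
                agg = agg + g_new - grads[i]  # update the aggregate incrementally
                grads[i] = g_new
                x = soft_threshold(x - step * agg, step * lam)
        return x

Replacing the Euclidean step with a Bregman proximal step generated by a Legendre kernel would recover the more general setting the abstract describes; the choice of components and regularizer here is purely for illustration.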
Keywords: incremental aggregated gradient; linear convergence; Lipschitz-like/convexity; relative smoothness; Bregman distance growth
DOI: 10.1287/moor.2019.1047
Indexed by: SCI
Language: English
Funding: National Science Foundation of China [61601488]; National Science Foundation of China [11401379]; National Science Foundation of China [11631013]; National Science Foundation of China [11971480]; National Science Foundation of China [11826204]; National Science Foundation of China [11771287]; National Science Foundation of China [71632007]; Key Project of the Chinese National Programs for Fundamental Research and Development [2015CB856002]; Fundamental Research Funds for the Central Universities
WOS Research Areas: Operations Research & Management Science; Mathematics
WOS Categories: Operations Research & Management Science; Mathematics, Applied
WOS Accession Number: WOS:000615980400003
Publisher: INFORMS
Document Type: Journal article
Identifier: http://ir.amss.ac.cn/handle/2S8OKBNM/58228
Collection: Academy of Mathematics and Systems Science, Chinese Academy of Sciences
Corresponding Author: Guo, Lei
Affiliations:
1. Natl Univ Def Technol, Dept Math, Changsha 410073, Peoples R China
2. Chinese Acad Sci, Acad Math & Syst Sci, Inst Computat Math & Sci Engn Comp, State Key Lab Sci & Engn Comp, Beijing 100190, Peoples R China
3. East China Univ Sci & Technol, Sch Business, Shanghai 200237, Peoples R China
Recommended Citation:
GB/T 7714
Zhang, Hui, Dai, Yu-Hong, Guo, Lei, et al. Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions[J]. MATHEMATICS OF OPERATIONS RESEARCH, 2021, 46(1): 61-81.
APA: Zhang, Hui, Dai, Yu-Hong, Guo, Lei, & Peng, Wei. (2021). Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions. MATHEMATICS OF OPERATIONS RESEARCH, 46(1), 61-81.
MLA: Zhang, Hui, et al. "Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions". MATHEMATICS OF OPERATIONS RESEARCH 46.1 (2021): 61-81.
Files in This Item:
No files associated with this item.