Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization
Gao, Juan (1); Liu, Xin-Wei (2); Dai, Yu-Hong (3,4); Huang, Yakui (2); Gu, Junhua (1)
2022-11-23
Journal: COMPUTATIONAL OPTIMIZATION AND APPLICATIONS
ISSN: 0926-6003
Pages: 42
Abstract: We consider a distributed non-convex optimization problem of minimizing the sum of all local cost functions over a network of agents. This problem often appears in large-scale distributed machine learning, known as non-convex empirical risk minimization. In this paper, we propose two accelerated algorithms, named DSGT-HB and DSGT-NAG, which combine the distributed stochastic gradient tracking (DSGT) method with momentum acceleration techniques. Under appropriate assumptions, we prove that both algorithms sublinearly converge to a neighborhood of a first-order stationary point of the distributed non-convex optimization problem. Moreover, we derive the conditions under which DSGT-HB and DSGT-NAG achieve a network-independent linear speedup. Numerical experiments on a distributed non-convex logistic regression problem with real data sets and on a deep neural network trained on the MNIST database demonstrate the superiority of DSGT-HB and DSGT-NAG over DSGT.
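As a rough illustration of the ingredients named in the abstract, the Python snippet below sketches a generic distributed stochastic gradient tracking update combined with a heavy-ball momentum term on a toy least-squares problem over a ring network. The step size, momentum parameter, mixing matrix W, and the local cost functions are all placeholder assumptions; this is not the exact DSGT-HB recursion or the experimental setup studied in the paper.

```python
# Illustrative sketch (NOT the paper's exact DSGT-HB algorithm): a generic
# distributed stochastic gradient tracking update with a heavy-ball momentum
# term, run on a toy least-squares problem over a 5-agent ring network.
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim, n_local = 5, 3, 20

# Each agent i privately holds f_i(x) = 0.5 * ||A_i x - b_i||^2.
A = [rng.standard_normal((n_local, dim)) for _ in range(n_agents)]
x_true = rng.standard_normal(dim)
b = [Ai @ x_true + 0.01 * rng.standard_normal(n_local) for Ai in A]

def stoch_grad(i, x):
    """Unbiased stochastic gradient of f_i at x from one sampled local row."""
    j = rng.integers(n_local)
    a, bi = A[i][j], b[i][j]
    return n_local * (a @ x - bi) * a

# Doubly stochastic mixing matrix for a ring graph (placeholder weights).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

alpha, beta, iters = 0.002, 0.5, 3000   # placeholder step size / momentum
x = np.zeros((n_agents, dim))           # local iterates x_i
x_prev = x.copy()                       # previous iterates, for the momentum term
g = np.array([stoch_grad(i, x[i]) for i in range(n_agents)])
y = g.copy()                            # gradient trackers y_i

for _ in range(iters):
    # Consensus step + tracked-gradient descent step + heavy-ball momentum.
    x_new = W @ x - alpha * y + beta * (x - x_prev)
    g_new = np.array([stoch_grad(i, x_new[i]) for i in range(n_agents)])
    y = W @ y + g_new - g               # track the network-average gradient
    x_prev, x, g = x, x_new, g_new

print("distance of averaged iterate to x_true:",
      np.linalg.norm(x.mean(axis=0) - x_true))
```

In this template each agent mixes its iterate with its neighbors, descends along a tracker of the network-average stochastic gradient, and adds a heavy-ball term beta * (x - x_prev); replacing that term with a Nesterov-style extrapolation would give the NAG-flavored variant in the same spirit as DSGT-NAG.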
Keywords: Distributed non-convex optimization; Machine learning; Momentum methods; Optimization algorithms; Convergence rate
DOI: 10.1007/s10589-022-00432-5
Indexed by: SCI
Language: English
Funding: National Natural Science Foundation of China [12071108, 11671116, 12021001, 11991021, 11991020, 11971372, 11701137]; Strategic Priority Research Program of CAS [XDA27000000]
WOS Research Areas: Operations Research & Management Science; Mathematics
WOS Categories: Operations Research & Management Science; Mathematics, Applied
WOS Accession Number: WOS:000886800900001
Publisher: SPRINGER
Document Type: Journal article
Identifier: http://ir.amss.ac.cn/handle/2S8OKBNM/60593
Collection: Academy of Mathematics and Systems Science, Chinese Academy of Sciences
Corresponding Author: Liu, Xin-Wei
Affiliations:
1. Hebei Univ Technol, Sch Artificial Intelligence, Tianjin 300401, Peoples R China
2. Hebei Univ Technol, Inst Math, Tianjin 300401, Peoples R China
3. Chinese Acad Sci, Acad Math & Syst Sci, ICMSEC, LSEC, Beijing 100190, Peoples R China
4. Univ Chinese Acad Sci, Sch Math Sci, Beijing 100049, Peoples R China
Recommended Citation:
GB/T 7714: Gao, Juan, Liu, Xin-Wei, Dai, Yu-Hong, et al. Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization[J]. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2022: 42.
APA: Gao, Juan, Liu, Xin-Wei, Dai, Yu-Hong, Huang, Yakui, & Gu, Junhua. (2022). Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 42.
MLA: Gao, Juan, et al. "Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization." COMPUTATIONAL OPTIMIZATION AND APPLICATIONS (2022): 42.
Files in This Item:
There are no files associated with this item.