Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization
Gao, Juan [1]; Liu, Xin-Wei [2]; Dai, Yu-Hong [3,4]; Huang, Yakui [2]; Gu, Junhua [1]
2022-11-23
Source Publication: COMPUTATIONAL OPTIMIZATION AND APPLICATIONS
ISSN: 0926-6003
Pages: 42
Abstract: We consider a distributed non-convex optimization problem of minimizing the sum of all local cost functions over a network of agents. This problem often appears in large-scale distributed machine learning, known as non-convex empirical risk minimization. In this paper, we propose two accelerated algorithms, named DSGT-HB and DSGT-NAG, which combine the distributed stochastic gradient tracking (DSGT) method with momentum acceleration techniques. Under appropriate assumptions, we prove that both algorithms sublinearly converge to a neighborhood of a first-order stationary point of the distributed non-convex optimization problem. Moreover, we derive the conditions under which DSGT-HB and DSGT-NAG achieve a network-independent linear speedup. Numerical experiments on a distributed non-convex logistic regression problem with real data sets and a deep neural network on the MNIST database show the superiority of DSGT-HB and DSGT-NAG over DSGT.
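The abstract describes combining gradient tracking with heavy-ball momentum. As a rough illustration of that template (not the paper's exact DSGT-HB algorithm — the mixing matrix, step size, momentum parameter, and least-squares losses below are all hypothetical choices for a toy ring network), each agent mixes its state with neighbors, steps along a tracked gradient estimate, and adds a momentum term:

```python
import numpy as np

# Minimal sketch of a gradient-tracking update with heavy-ball momentum,
# in the spirit of DSGT-HB; problem data and constants are illustrative only.
rng = np.random.default_rng(0)

n, d = 4, 3                                   # agents, dimension
A = [rng.standard_normal((5, d)) for _ in range(n)]
b = [rng.standard_normal(5) for _ in range(n)]

def grad(i, x, noise=0.0):
    """Stochastic gradient of agent i's local least-squares loss."""
    g = A[i].T @ (A[i] @ x - b[i]) / 5
    return g + noise * rng.standard_normal(d)

# Doubly stochastic mixing matrix for a ring of 4 agents.
W = np.array([[0.5, 0.25, 0.0, 0.25],
              [0.25, 0.5, 0.25, 0.0],
              [0.0, 0.25, 0.5, 0.25],
              [0.25, 0.0, 0.25, 0.5]])

alpha, beta, sigma = 0.05, 0.3, 0.01          # step size, momentum, noise level
X = np.zeros((n, d))                          # stacked agent states x_i
G = np.array([grad(i, X[i], sigma) for i in range(n)])
Y = G.copy()                                  # gradient trackers, y_i^0 = g_i^0
X_prev = X.copy()

for k in range(500):
    # Heavy-ball step: consensus mixing - tracked gradient + momentum term.
    X_new = W @ X - alpha * Y + beta * (X - X_prev)
    G_new = np.array([grad(i, X_new[i], sigma) for i in range(n)])
    # Gradient tracking: mix trackers with neighbors, add gradient increment.
    Y = W @ Y + G_new - G
    X_prev, X, G = X, X_new, G_new

x_bar = X.mean(axis=0)
full_grad = sum(grad(i, x_bar) for i in range(n)) / n
print(np.linalg.norm(full_grad))              # small: near a stationary point
```

Consistent with the abstract's convergence statement, the iterates settle in a neighborhood of a stationary point whose size depends on the step size and the gradient-noise level.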
Keywords: Distributed non-convex optimization; Machine learning; Momentum methods; Optimization algorithms; Convergence rate
DOI: 10.1007/s10589-022-00432-5
Indexed By: SCI
Language: English
Funding Project: National Natural Science Foundation of China [12071108, 11671116, 12021001, 11991021, 11991020, 11971372, 11701137]; Strategic Priority Research Program of CAS [XDA27000000]
WOS Research Area: Operations Research & Management Science; Mathematics
WOS Subject: Operations Research & Management Science; Mathematics, Applied
WOS ID: WOS:000886800900001
Publisher: SPRINGER
Document Type: Journal article
Identifier: http://ir.amss.ac.cn/handle/2S8OKBNM/60593
Collection: Academy of Mathematics and Systems Science, Chinese Academy of Sciences
Corresponding Author: Liu, Xin-Wei
Affiliations:
1. Hebei Univ Technol, Sch Artificial Intelligence, Tianjin 300401, Peoples R China
2. Hebei Univ Technol, Inst Math, Tianjin 300401, Peoples R China
3. Chinese Acad Sci, Acad Math & Syst Sci, ICMSEC, LSEC, Beijing 100190, Peoples R China
4. Univ Chinese Acad Sci, Sch Math Sci, Beijing 100049, Peoples R China
Recommended Citation
GB/T 7714: Gao, Juan, Liu, Xin-Wei, Dai, Yu-Hong, et al. Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization [J]. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2022: 42.
APA: Gao, Juan, Liu, Xin-Wei, Dai, Yu-Hong, Huang, Yakui, & Gu, Junhua. (2022). Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 42.
MLA: Gao, Juan, et al. "Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization." COMPUTATIONAL OPTIMIZATION AND APPLICATIONS (2022): 42.
Files in This Item:
There are no files associated with this item.
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.