CSpace
A Minibatch Proximal Stochastic Recursive Gradient Algorithm Using a Trust-Region-Like Scheme and Barzilai-Borwein Stepsizes
Yu, Tengteng1; Liu, Xin-Wei2; Dai, Yu-Hong3,4; Sun, Jie5,6
2021-10-01
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Volume: 32; Issue: 10; Pages: 4627-4638
Abstract: We consider the problem of minimizing the sum of an average of a large number of smooth convex component functions and a possibly nonsmooth convex function that admits a simple proximal mapping. This class of problems arises frequently in machine learning, known as regularized empirical risk minimization (ERM). In this article, we propose mSRGTR-BB, a minibatch proximal stochastic recursive gradient algorithm, which employs a trust-region-like scheme to select stepsizes that are automatically computed by the Barzilai-Borwein method. We prove that mSRGTR-BB converges linearly in expectation for strongly and nonstrongly convex objective functions. With proper parameters, mSRGTR-BB enjoys a faster convergence rate than the state-of-the-art minibatch proximal variant of the semistochastic gradient method (mS2GD). Numerical experiments on standard data sets show that the performance of mSRGTR-BB is comparable to and sometimes even better than mS2GD with best-tuned stepsizes and is superior to some modern proximal stochastic gradient methods.
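The abstract's core ingredient is the Barzilai-Borwein (BB) stepsize, which the paper uses to compute stepsizes automatically inside a stochastic recursive gradient method. The following is a minimal illustrative sketch of the BB idea only, not the paper's mSRGTR-BB algorithm: plain gradient descent with the BB1 stepsize eta_k = ||s_k||^2 / (s_k^T y_k), where s_k and y_k are successive iterate and gradient differences, applied to a small strongly convex quadratic chosen here purely for demonstration.

```python
import numpy as np

# Illustrative sketch (not the paper's mSRGTR-BB): gradient descent with
# Barzilai-Borwein (BB1) stepsizes on a strongly convex quadratic
# f(x) = 0.5 * x^T A x - b^T x, whose gradient is A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b

x_prev = np.zeros(2)
g_prev = grad(x_prev)
x = x_prev - 0.1 * g_prev            # one fixed-stepsize iteration to start
for _ in range(50):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:    # converged; avoid 0/0 in the BB ratio
        break
    s = x - x_prev                   # iterate difference s_k
    y = g - g_prev                   # gradient difference y_k
    eta = (s @ s) / (s @ y)          # BB1 stepsize ||s_k||^2 / (s_k^T y_k)
    x_prev, g_prev = x, g
    x = x - eta * g

x_star = np.linalg.solve(A, b)       # exact minimizer for comparison
print(np.allclose(x, x_star, atol=1e-6))
```

The paper's actual method additionally handles a nonsmooth regularizer via a proximal mapping, uses minibatch recursive gradient estimates, and moderates the BB stepsize with a trust-region-like scheme; this sketch shows only how BB stepsizes adapt to local curvature without any manual tuning.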
Keywords: Convergence; Convex functions; Risk management; Gradient methods; Learning systems; Sun; Barzilai-Borwein (BB) method; empirical risk minimization (ERM); proximal method; stochastic gradient; trust-region
DOI: 10.1109/TNNLS.2020.3025383
Indexed by: SCI
Language: English
Funding: Chinese NSF [11671116]; Chinese NSF [11701137]; Chinese NSF [12071108]; Chinese NSF [11631013]; Chinese NSF [11991020]; Chinese NSF [12021001]; Major Research Plan of the NSFC [91630202]; Beijing Academy of Artificial Intelligence (BAAI)
WOS Research Areas: Computer Science; Engineering
WOS Categories: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS Record No.: WOS:000704111000031
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Document Type: Journal article
Identifier: http://ir.amss.ac.cn/handle/2S8OKBNM/59405
Collection: Academy of Mathematics and Systems Science, Chinese Academy of Sciences
Corresponding Author: Liu, Xin-Wei
Affiliations: 1. Hebei Univ Technol, Sch Artificial Intelligence, Tianjin 300401, Peoples R China
2.Hebei Univ Technol, Inst Math, Tianjin 300401, Peoples R China
3.Chinese Acad Sci, Acad Math & Syst Sci, State Key Lab Sci & Engn Comp LSEC, Beijing 100190, Peoples R China
4.Univ Chinese Acad Sci, Sch Math Sci, Beijing 100049, Peoples R China
5.Hebei Univ Technol, Inst Math, Tianjin 300401, Peoples R China
6.Natl Univ Singapore, Sch Business, Singapore 119245, Singapore
Recommended Citation:
GB/T 7714
Yu, Tengteng,Liu, Xin-Wei,Dai, Yu-Hong,et al. A Minibatch Proximal Stochastic Recursive Gradient Algorithm Using a Trust-Region-Like Scheme and Barzilai-Borwein Stepsizes[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS,2021,32(10):4627-4638.
APA Yu, Tengteng, Liu, Xin-Wei, Dai, Yu-Hong, & Sun, Jie. (2021). A Minibatch Proximal Stochastic Recursive Gradient Algorithm Using a Trust-Region-Like Scheme and Barzilai-Borwein Stepsizes. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 32(10), 4627-4638.
MLA Yu, Tengteng, et al. "A Minibatch Proximal Stochastic Recursive Gradient Algorithm Using a Trust-Region-Like Scheme and Barzilai-Borwein Stepsizes". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 32.10 (2021): 4627-4638.
Files in This Item:
No files are associated with this item.
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.