One-Shot Neural Architecture Search: Maximising Diversity to Overcome Catastrophic Forgetting
Zhang, Miao1,2,3; Li, Huiqi1; Pan, Shirui2; Chang, Xiaojun2,4; Zhou, Chuan5; Ge, Zongyuan6; Su, Steven3
2021-09-01
Source Publication: IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
ISSN: 0162-8828
Volume: 43  Issue: 9  Pages: 2921-2935
Abstract: One-shot neural architecture search (NAS) has recently become mainstream in the NAS community because it significantly improves computational efficiency through weight sharing. However, the supernet training paradigm in one-shot NAS introduces catastrophic forgetting: each training step can deteriorate the performance of other architectures whose weights are partially shared with the current architecture. To overcome this problem, we formulate supernet training for one-shot NAS as a constrained continual learning optimization problem, such that learning the current architecture does not degrade the validation accuracy of previous architectures. The key to solving this constrained optimization problem is a novelty search based architecture selection (NSAS) loss function that regularizes supernet training by using a greedy novelty search method to find the most representative subset. We apply the NSAS loss function to two one-shot NAS baselines and extensively test them on both a common search space and a NAS benchmark dataset. We further derive three variants of the NSAS loss function: NSAS with a depth constraint (NSAS-C) to improve transferability, and NSAS-G and NSAS-LG to handle situations with a limited number of constraints. Experiments on the common NAS search space demonstrate that NSAS and its variants improve the predictive ability of supernet training in one-shot NAS, with remarkable and efficient performance on the CIFAR-10, CIFAR-100, and ImageNet datasets. Results on the NAS benchmark dataset also confirm the significant improvements these one-shot NAS baselines can achieve.
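The abstract describes selecting a maximally diverse subset of architectures via a greedy novelty search and using that subset as the constraint set that regularizes supernet training. The Python sketch below illustrates only the subset-selection step, under illustrative assumptions that are not taken from the paper: architectures are encoded as fixed-length tuples of operation ids, and novelty is measured by mean Hamming distance to the already-selected architectures.

import random

def hamming_distance(arch_a, arch_b):
    # Assumed encoding: two equal-length tuples of operation ids.
    return sum(a != b for a, b in zip(arch_a, arch_b))

def novelty(candidate, selected):
    # Novelty of a candidate = mean distance to the architectures already kept.
    if not selected:
        return float("inf")
    return sum(hamming_distance(candidate, kept) for kept in selected) / len(selected)

def greedy_novelty_subset(sampled_archs, subset_size):
    # Greedily pick the most mutually diverse architectures; this subset would
    # then act as the constraint set in an NSAS-style regularized supernet update.
    subset, pool = [], list(sampled_archs)
    while pool and len(subset) < subset_size:
        best = max(pool, key=lambda arch: novelty(arch, subset))
        subset.append(best)
        pool.remove(best)
    return subset

if __name__ == "__main__":
    random.seed(0)
    # Toy search space: 8 edges with 4 candidate operations each (hypothetical sizes).
    sampled = [tuple(random.randrange(4) for _ in range(8)) for _ in range(50)]
    print(greedy_novelty_subset(sampled, subset_size=5))

In the full method, the selected subset would enter the NSAS loss as a regularization term so that updating the shared weights for the current architecture does not degrade the constraint architectures; that loss term depends on the supernet implementation and is not shown here.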
Keyword: Computer architecture; Training; Optimization; Neural networks; Search methods; Australia; Germanium; AutoML; neural architecture search; continual learning; catastrophic forgetting; novelty search
DOI: 10.1109/TPAMI.2020.3035351
Indexed By: SCI
Language: English
Funding Project: NSFC [61702415]; NSFC [61972315]; Australian Research Council (ARC) under a Discovery Early Career Researcher Award (DECRA) [DE190100626]; Air Force Research Laboratory, DARPA [FA8750-19-20501]; Youth Innovation Promotion Association CAS [2017210]
WOS Research Area: Computer Science; Engineering
WOS Subject: Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic
WOS ID: WOS:000681124300008
Publisher: IEEE COMPUTER SOC
Document Type: Journal article
Identifier: http://ir.amss.ac.cn/handle/2S8OKBNM/59033
Collection: Institute of Applied Mathematics
Corresponding Author: Li, Huiqi; Pan, Shirui
Affiliation: 1. Beijing Inst Technol, Sch Informat & Elect, Beijing 100081, Peoples R China
2. Monash Univ, Fac Informat Technol, Clayton, Vic 3800, Australia
3. Univ Technol Sydney, Fac Engn & Informat Technol, Ultimo, NSW 2007, Australia
4. King Abdulaziz Univ, Fac Comp & Informat Technol, Jeddah 21589, Saudi Arabia
5. Chinese Acad Sci, Acad Math & Syst Sci, Beijing 100081, Peoples R China
6. Monash Univ, Monash E Res Ctr, Clayton, Vic 3800, Australia
Recommended Citation
GB/T 7714: Zhang, Miao, Li, Huiqi, Pan, Shirui, et al. One-Shot Neural Architecture Search: Maximising Diversity to Overcome Catastrophic Forgetting[J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2021, 43(9): 2921-2935.
APA: Zhang, Miao., Li, Huiqi., Pan, Shirui., Chang, Xiaojun., Zhou, Chuan., ... & Su, Steven. (2021). One-Shot Neural Architecture Search: Maximising Diversity to Overcome Catastrophic Forgetting. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 43(9), 2921-2935.
MLA: Zhang, Miao, et al. "One-Shot Neural Architecture Search: Maximising Diversity to Overcome Catastrophic Forgetting". IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE 43.9 (2021): 2921-2935.
Files in This Item:
There are no files associated with this item.