Elastic Information Bottleneck
Ni, Yuyan (1); Lan, Yanyan (2); Liu, Ao (3); Ma, Zhiming (1)
2022-09-01
Source Publication: MATHEMATICS
Volume: 10  Issue: 18  Pages: 26
Abstract: The information bottleneck is an information-theoretic principle of representation learning that aims to learn a maximally compressed representation that preserves as much information about the labels as possible. Under this principle, two different methods have been proposed, i.e., the information bottleneck (IB) and the deterministic information bottleneck (DIB), and both have made significant progress in explaining the representation mechanisms of deep learning algorithms. However, these theoretical and empirical successes are valid only under the assumption that training and test data are drawn from the same distribution, which is clearly not satisfied in many real-world applications. In this paper, we study their generalization abilities in a transfer learning scenario, where the target error can be decomposed into three components, i.e., the source empirical error, the source generalization gap (SG), and the representation discrepancy (RD). Comparing IB and DIB in terms of these components, we prove that DIB's SG bound is tighter than IB's, while DIB's RD is larger than IB's. Therefore, it is difficult to tell which one is better. To balance the trade-off between SG and RD, we propose an elastic information bottleneck (EIB) that interpolates between the IB and DIB regularizers, which guarantees a Pareto frontier within the IB framework. Additionally, simulations and real-data experiments show that EIB achieves better domain adaptation results than IB and DIB, which validates our theory.
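For context, the two objectives compared in the abstract can be written as the standard IB and DIB Lagrangians, and an "elastic" interpolation between their regularizers can be sketched as a convex combination. The interpolated form below is an illustrative assumption (it follows the generalized bottleneck family of Strouse and Schwab) and is not necessarily the exact EIB objective defined in the paper:

% Information bottleneck (IB): compress X into T while preserving information about Y
\mathcal{L}_{\mathrm{IB}} \;=\; I(X;T) \;-\; \beta\, I(T;Y)

% Deterministic information bottleneck (DIB): the compression term I(X;T) is replaced by H(T)
\mathcal{L}_{\mathrm{DIB}} \;=\; H(T) \;-\; \beta\, I(T;Y)

% Illustrative interpolation with \alpha \in [0,1]: \alpha = 0 recovers IB, \alpha = 1 recovers DIB.
% Since H(T) = I(X;T) + H(T|X), this equals I(X;T) + \alpha H(T|X) - \beta I(T;Y).
\mathcal{L}_{\alpha} \;=\; (1-\alpha)\, I(X;T) \;+\; \alpha\, H(T) \;-\; \beta\, I(T;Y)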
Keywords: information bottleneck; transfer learning; generalization bound
DOI: 10.3390/math10183352
Indexed By: SCI
Language: English
Funding Project: National Key R&D Program of China [2021YFF1201600]; Vanke Special Fund for Public Health and Health Discipline Development, Tsinghua University [2022-1080053]; Beijing Academy of Artificial Intelligence (BAAI)
WOS Research Area: Mathematics
WOS Subject: Mathematics
WOS ID: WOS:000859604800001
Publisher: MDPI
Document Type: Journal Article
Identifier: http://ir.amss.ac.cn/handle/2S8OKBNM/60939
Collection: Institute of Applied Mathematics
Corresponding Author: Lan, Yanyan
Affiliation: 1. Chinese Acad Sci, Acad Math & Syst Sci, Beijing 100190, Peoples R China
2. Tsinghua Univ, Inst AI Ind Res, Beijing 100084, Peoples R China
3. Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing 100049, Peoples R China
Recommended Citation:
GB/T 7714: Ni, Yuyan, Lan, Yanyan, Liu, Ao, et al. Elastic Information Bottleneck[J]. MATHEMATICS, 2022, 10(18): 26.
APA: Ni, Yuyan, Lan, Yanyan, Liu, Ao, & Ma, Zhiming. (2022). Elastic Information Bottleneck. MATHEMATICS, 10(18), 26.
MLA: Ni, Yuyan, et al. "Elastic Information Bottleneck". MATHEMATICS 10.18 (2022): 26.
Files in This Item:
There are no files associated with this item.