What makes for uniformity for non-contrastive self-supervised learning?
Wang YinQuan2,3; Zhang XiaoPeng4; Tian Qi4; Lu JinHu1
Journal: SCIENCE CHINA-TECHNOLOGICAL SCIENCES
Publication Date: 2022-09-16
Pages: 10
Keywords: contrastive learning; self-supervised learning; representation; uniformity; dynamics
ISSN: 1674-7321
DOI: 10.1007/s11431-021-2041-7
Abstract: Self-supervised learning (SSL) has made remarkable progress recently, especially with contrastive methods that pull two augmented views of the same image together and push the views of all other images away. In this setting, negative pairs play a key role in avoiding collapsed representations. Recent studies, such as bootstrap your own latent (BYOL) and SimSiam, surprisingly achieve comparable performance even without contrasting negative samples. This raises a basic theoretical question for SSL: how do different SSL methods avoid collapsed representations, and is there a common design principle? In this study, we look deeply into current non-contrastive SSL methods and analyze the key factors that prevent collapse. To this end, we present a new uniformity metric as an indicator and study its local dynamics to diagnose collapse in different scenarios. Moreover, we present principles for choosing a good predictor, so that the optimization process can be controlled explicitly. Our theoretical analysis is validated on widely used benchmarks spanning datasets of different scales. We also compare recent SSL methods, analyze their commonalities in avoiding collapse, and discuss ideas for future algorithm designs.
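For illustration only (not taken from the paper): below is a minimal PyTorch sketch of a SimSiam-style non-contrastive objective, i.e. a predictor head trained with a stop-gradient on the target branch, together with the commonly used uniformity measure over L2-normalized embeddings (log of the mean Gaussian potential over pairwise distances, in the style of Wang and Isola, 2020). All module sizes and names are hypothetical placeholders; the paper's own uniformity indicator and predictor design principles are not reproduced here.

# Illustrative sketch, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

def simsiam_loss(p1, p2, z1, z2):
    """Negative cosine similarity with stop-gradient on the target branch (z1, z2)."""
    def d(p, z):
        return -F.cosine_similarity(p, z.detach(), dim=-1).mean()
    return 0.5 * (d(p1, z2) + d(p2, z1))

def uniformity(z, t=2.0):
    """Standard uniformity measure: log of the mean Gaussian potential over
    pairwise squared distances of L2-normalized embeddings."""
    z = F.normalize(z, dim=-1)
    return torch.pdist(z, p=2).pow(2).mul(-t).exp().mean().log()

if __name__ == "__main__":
    # Hypothetical encoder and predictor; random tensors stand in for two augmented views.
    encoder = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 16))
    predictor = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 16))

    x1, x2 = torch.randn(128, 32), torch.randn(128, 32)
    z1, z2 = encoder(x1), encoder(x2)
    p1, p2 = predictor(z1), predictor(z2)

    loss = simsiam_loss(p1, p2, z1, z2)
    print(f"loss = {loss.item():.4f}, uniformity = {uniformity(z1).item():.4f}")

In this sketch, a more negative (smaller) uniformity value indicates embeddings spread more evenly on the hypersphere, while values near zero suggest collapse; the stop-gradient plus predictor is the mechanism BYOL and SimSiam rely on instead of negative pairs.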
Funding: National Natural Science Foundation of China [61621003]
WOS Research Areas: Engineering; Materials Science
Language: English
Publisher: SCIENCE PRESS
WOS Accession Number: WOS:000855997800001
Content Type: Journal article
Source URL: http://ir.amss.ac.cn/handle/2S8OKBNM/61028
Collection: Academy of Mathematics and Systems Science, Chinese Academy of Sciences
Corresponding Author: Zhang XiaoPeng
作者单位1.Beihang Univ, Sch Automat Sci & Elect Engn, State Key Lab Software Dev Environm, Beijing 100191, Peoples R China
2.Acad Math & Syst Sci, Chinese Acad Sci, Key Lab Syst & Control, Beijing 100190, Peoples R China
3.Univ Chinese Acad Sci, Sch Math Sci, Beijing 100049, Peoples R China
4.Huawei Inc, Shenzhen 518128, Peoples R China
Recommended Citation
GB/T 7714
Wang YinQuan, Zhang XiaoPeng, Tian Qi, et al. What makes for uniformity for non-contrastive self-supervised learning?[J]. SCIENCE CHINA-TECHNOLOGICAL SCIENCES, 2022: 10.
APA Wang YinQuan, Zhang XiaoPeng, Tian Qi, & Lu JinHu. (2022). What makes for uniformity for non-contrastive self-supervised learning?. SCIENCE CHINA-TECHNOLOGICAL SCIENCES, 10.
MLA Wang YinQuan, et al. "What makes for uniformity for non-contrastive self-supervised learning?". SCIENCE CHINA-TECHNOLOGICAL SCIENCES (2022): 10.