Enabling Secure NVM-Based in-Memory Neural Network Computing by Sparse Fast Gradient Encryption
Authors: Cai, Yi [1]; Chen, Xiaoming [2]; Tian, Lu [3]; Wang, Yu [1]; Yang, Huazhong [1]
Journal: IEEE TRANSACTIONS ON COMPUTERS
Publication Date: 2020-11-01
Volume: 69; Issue: 11; Pages: 1596-1610
Keywords: Artificial neural networks; Nonvolatile memory; Encryption; Computational modeling; Hardware; Non-volatile memory (NVM); Compute-in-memory (CIM); Neural network; Security; Encryption
ISSN: 0018-9340
DOI: 10.1109/TC.2020.3017870
Abstract: Neural network (NN) computing is energy-consuming on traditional computing systems, owing to the inherent memory-wall bottleneck of the von Neumann architecture and Moore's Law approaching its end. Non-volatile memories (NVMs) have been demonstrated as promising alternatives for constructing computing-in-memory (CIM) systems to accelerate NN computing. However, NVM-based NN computing systems are vulnerable to confidentiality attacks because the weight parameters persist in memory when the system is powered off, enabling an adversary with physical access to extract the well-trained NN models. The goal of this article is to find a solution for thwarting confidentiality attacks. We define and model the weight encryption problem. Then we propose an effective framework, containing a sparse fast gradient encryption (SFGE) method and a runtime encryption scheduling (RES) scheme, to guarantee the confidentiality security of NN models with negligible performance overhead. Moreover, we improve the SFGE method by incrementally generating the encryption keys. Additionally, we provide variants of the encryption method to better fit quantized models and various mapping strategies. The experiments demonstrate that by encrypting only an extremely small proportion of the weights (e.g., 20 weights per layer in ResNet-101), the NN models can be strictly protected.
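The core idea in the abstract is that perturbing only the few weights to which the loss is most sensitive (chosen by gradient magnitude, in the style of fast-gradient attacks) is enough to destroy model accuracy until the perturbations are removed. The following PyTorch sketch illustrates that idea only; it is not the authors' implementation. The function names (sfge_encrypt_sketch, sfge_decrypt_sketch), the per-layer count k, and the step size epsilon are illustrative assumptions, and a classification model with a loader yielding (inputs, targets) pairs is assumed.

```python
# Minimal sketch (not the paper's code): FGSM-style sparse weight "encryption".
import torch
import torch.nn.functional as F

def sfge_encrypt_sketch(model, data_loader, k=20, epsilon=0.1, device="cpu"):
    """Perturb the k most gradient-sensitive weights of each weight tensor,
    returning the (indices, perturbations) needed to undo the change."""
    model.to(device).eval()
    inputs, targets = next(iter(data_loader))        # one calibration batch
    inputs, targets = inputs.to(device), targets.to(device)
    model.zero_grad()
    loss = F.cross_entropy(model(inputs), targets)   # task loss
    loss.backward()                                  # gradients w.r.t. weights

    key = {}                                         # name -> (indices, deltas)
    with torch.no_grad():
        for name, param in model.named_parameters():
            if param.grad is None or param.dim() < 2:
                continue                             # skip biases/norm params in this sketch
            grad = param.grad.reshape(-1)
            idx = torch.topk(grad.abs(), min(k, grad.numel())).indices
            delta = epsilon * grad[idx].sign()       # ascend the loss (FGSM-like)
            param.data.view(-1)[idx] += delta        # "encrypt" in place
            key[name] = (idx.cpu(), delta.cpu())
    return key

def sfge_decrypt_sketch(model, key):
    """Subtract the stored perturbations to restore the original weights."""
    with torch.no_grad():
        for name, param in model.named_parameters():
            if name in key:
                idx, delta = key[name]
                param.data.view(-1)[idx.to(param.device)] -= delta.to(param.device)
```

In this sketch the stored (index, perturbation) pairs play the role of the encryption key; the paper's actual SFGE and RES schemes, including incremental key generation and the variants for quantized models and mapping strategies, are described in the full text.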
Funding: National Key Research and Development Program of China [2018YFB0105000]; National Key Research and Development Program of China [2017YFA0207600]; National Natural Science Foundation of China [U19B2019]; National Natural Science Foundation of China [61832007]; National Natural Science Foundation of China [61621091]; National Natural Science Foundation of China [61720106013]; Beijing National Research Center for Information Science and Technology (BNRist); Beijing Innovation Center for Future Chips; Beijing Academy of Artificial Intelligence (BAAI)
WOS Research Areas: Computer Science; Engineering
Language: English
Publisher: IEEE COMPUTER SOC
WOS Accession Number: WOS:000576255400004
Content Type: Journal article
Source URL: http://119.78.100.204/handle/2XEOYT63/15662
Collection: Institute of Computing Technology, Chinese Academy of Sciences, Journal Papers (English)
Corresponding Author: Wang, Yu
Author Affiliations:
1. Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol (BNRist), Dept Elect Engn, Beijing 100084, Peoples R China
2. Chinese Acad Sci, Inst Comp Technol, State Key Lab Comp Architecture, Beijing 100864, Peoples R China
3. Xilinx Inc, Beijing, Peoples R China
Recommended Citation:
GB/T 7714: Cai, Yi, Chen, Xiaoming, Tian, Lu, et al. Enabling Secure NVM-Based in-Memory Neural Network Computing by Sparse Fast Gradient Encryption[J]. IEEE TRANSACTIONS ON COMPUTERS, 2020, 69(11): 1596-1610.
APA: Cai, Yi, Chen, Xiaoming, Tian, Lu, Wang, Yu, & Yang, Huazhong. (2020). Enabling Secure NVM-Based in-Memory Neural Network Computing by Sparse Fast Gradient Encryption. IEEE TRANSACTIONS ON COMPUTERS, 69(11), 1596-1610.
MLA: Cai, Yi, et al. "Enabling Secure NVM-Based in-Memory Neural Network Computing by Sparse Fast Gradient Encryption". IEEE TRANSACTIONS ON COMPUTERS 69.11 (2020): 1596-1610.