Title | Enabling Secure NVM-Based in-Memory Neural Network Computing by Sparse Fast Gradient Encryption
Authors | Cai, Yi1; Chen, Xiaoming2; Tian, Lu3; Wang, Yu1; Yang, Huazhong1
Journal | IEEE TRANSACTIONS ON COMPUTERS
Date | 2020-11-01
Volume/Issue/Pages | 69(11): 1596-1610
Keywords | Artificial neural networks; Nonvolatile memory; Encryption; Computational modeling; Hardware; Non-volatile memory (NVM); compute-in-memory (CIM); neural network; security; encryption
ISSN | 0018-9340
DOI | 10.1109/TC.2020.3017870 |
Abstract | Neural network (NN) computing is energy-consuming on traditional computing systems, owing to the inherent memory-wall bottleneck of the von Neumann architecture and the approaching end of Moore's Law. Non-volatile memories (NVMs) have been demonstrated as promising alternatives for constructing computing-in-memory (CIM) systems to accelerate NN computing. However, NVM-based NN computing systems are vulnerable to confidentiality attacks because the weight parameters persist in memory when the system is powered off, enabling an adversary with physical access to extract the well-trained NN models. The goal of this article is to find a solution for thwarting confidentiality attacks. We define and model the weight encryption problem. Then we propose an effective framework, comprising a sparse fast gradient encryption (SFGE) method and a runtime encryption scheduling (RES) scheme, to guarantee the confidentiality of NN models with negligible performance overhead. Moreover, we improve the SFGE method by incrementally generating the encryption keys. Additionally, we provide variants of the encryption method to better fit quantized models and various mapping strategies. The experiments demonstrate that by encrypting only an extremely small proportion of the weights (e.g., 20 weights per layer in ResNet-101), the NN models can be strictly protected.
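The core idea sketched in the abstract, encrypting only the few weights to which the loss is most sensitive, perturbing them in the sign of the gradient (FGSM-style) so that a stolen model is useless without the key, can be illustrated with a toy example. The model, function names, and parameters below are hypothetical stand-ins, not the authors' implementation:

```python
import numpy as np

# Toy sketch of the SFGE idea (hypothetical, not the paper's code):
# encrypt only the k weights with the largest loss-gradient magnitude,
# shifting each by eps * sign(gradient) so accuracy collapses without the key.

rng = np.random.default_rng(0)

# A tiny logistic-regression "model" standing in for one NN layer.
X = rng.normal(size=(64, 8))
w_true = rng.normal(size=8)
y = (1.0 / (1.0 + np.exp(-(X @ w_true))) > 0.5).astype(float)

def loss_and_grad(w):
    """Binary cross-entropy loss and its gradient w.r.t. the weights."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    grad = X.T @ (p - y) / len(y)
    return loss, grad

def sfge_encrypt(w, k=2, eps=5.0):
    """Perturb the k most gradient-sensitive weights by eps * sign(grad).
    Returns the encrypted weights and the key (indices + perturbations)."""
    _, g = loss_and_grad(w)
    idx = np.argsort(-np.abs(g))[:k]   # the k most sensitive weights
    delta = eps * np.sign(g[idx])      # fast-gradient perturbation
    w_enc = w.copy()
    w_enc[idx] += delta
    return w_enc, (idx, delta)

def sfge_decrypt(w_enc, key):
    """Subtract the keyed perturbations to restore the original weights."""
    idx, delta = key
    w_dec = w_enc.copy()
    w_dec[idx] -= delta
    return w_dec

w_enc, key = sfge_encrypt(w_true)
w_dec = sfge_decrypt(w_enc, key)

loss_clean, _ = loss_and_grad(w_true)
loss_enc, _ = loss_and_grad(w_enc)
# The encrypted model's loss rises (the logistic loss is convex and the
# perturbation moves along a direction of positive directional derivative),
# while decryption recovers the original weights exactly.
```

Only the key (a handful of indices and perturbation values) needs secure storage; the bulk of the weights stay in NVM unmodified, which is why the overhead reported in the paper is negligible.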
Funding | National Key Research and Development Program of China [2018YFB0105000]; National Key Research and Development Program of China [2017YFA0207600]; National Natural Science Foundation of China [U19B2019]; National Natural Science Foundation of China [61832007]; National Natural Science Foundation of China [61621091]; National Natural Science Foundation of China [61720106013]; Beijing National Research Center for Information Science and Technology (BNRist); Beijing Innovation Center for Future Chips; Beijing Academy of Artificial Intelligence (BAAI)
WOS Research Areas | Computer Science; Engineering
Language | English
Publisher | IEEE COMPUTER SOC
WOS Accession No. | WOS:000576255400004
Content Type | Journal article
Source URL | [http://119.78.100.204/handle/2XEOYT63/15662]
Collection | Institute of Computing Technology, Chinese Academy of Sciences: Journal Articles (English)
Corresponding Author | Wang, Yu
Author Affiliations | 1. Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol (BNRist), Dept Elect Engn, Beijing 100084, Peoples R China; 2. Chinese Acad Sci, Inst Comp Technol, State Key Lab Comp Architecture, Beijing 100864, Peoples R China; 3. Xilinx Inc, Beijing, Peoples R China
Recommended Citation (GB/T 7714) | Cai, Yi, Chen, Xiaoming, Tian, Lu, et al. Enabling Secure NVM-Based in-Memory Neural Network Computing by Sparse Fast Gradient Encryption[J]. IEEE TRANSACTIONS ON COMPUTERS, 2020, 69(11): 1596-1610.
APA | Cai, Yi, Chen, Xiaoming, Tian, Lu, Wang, Yu, & Yang, Huazhong. (2020). Enabling Secure NVM-Based in-Memory Neural Network Computing by Sparse Fast Gradient Encryption. IEEE TRANSACTIONS ON COMPUTERS, 69(11), 1596-1610.
MLA | Cai, Yi, et al. "Enabling Secure NVM-Based in-Memory Neural Network Computing by Sparse Fast Gradient Encryption." IEEE TRANSACTIONS ON COMPUTERS 69.11 (2020): 1596-1610.