Paper Title

ShadowNet: A Secure and Efficient On-device Model Inference System for Convolutional Neural Networks

Paper Authors

Zhichuang Sun, Ruimin Sun, Changming Liu, Amrita Roy Chowdhury, Long Lu, Somesh Jha

Paper Abstract

With the increased usage of AI accelerators on mobile and edge devices, on-device machine learning (ML) is gaining popularity. Thousands of proprietary ML models are being deployed today on billions of untrusted devices. This raises serious security concerns about model privacy. However, protecting model privacy without losing access to the untrusted AI accelerators is a challenging problem. In this paper, we present a novel on-device model inference system, ShadowNet. ShadowNet protects the model privacy with Trusted Execution Environment (TEE) while securely outsourcing the heavy linear layers of the model to the untrusted hardware accelerators. ShadowNet achieves this by transforming the weights of the linear layers before outsourcing them and restoring the results inside the TEE. The non-linear layers are also kept secure inside the TEE. ShadowNet's design ensures efficient transformation of the weights and the subsequent restoration of the results. We build a ShadowNet prototype based on TensorFlow Lite and evaluate it on five popular CNNs, namely, MobileNet, ResNet-44, MiniVGG, ResNet-404, and YOLOv4-tiny. Our evaluation shows that ShadowNet achieves strong security guarantees with reasonable performance, offering a practical solution for secure on-device model inference.
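
To make the abstract's core idea concrete, here is a minimal NumPy sketch of "transform the weights, outsource the heavy linear computation, restore the result inside the TEE" for a single fully connected layer. The random invertible mixing matrix used as the masking transform is an illustrative assumption, not ShadowNet's exact scheme, which operates on the linear layers of real CNNs and differs in its details.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out = 128, 64
W = rng.standard_normal((d_in, d_out))   # secret layer weights, kept inside the TEE
x = rng.standard_normal((1, d_in))       # layer input

# Inside the TEE: pick a random invertible mixing matrix (illustrative assumption)
# and mask the weights with it.
M = rng.standard_normal((d_out, d_out))
M_inv = np.linalg.inv(M)
W_masked = W @ M                         # only this masked matrix leaves the TEE

# On the untrusted accelerator: run the heavy linear computation on masked weights.
y_masked = x @ W_masked

# Back inside the TEE: apply the inverse transform to recover the true output.
y = y_masked @ M_inv

assert np.allclose(y, x @ W)
```

Because the masking transform is itself linear, the untrusted accelerator still performs the expensive multiplication while the TEE only pays for a comparatively cheap transformation and restoration step, which is the efficiency property the abstract claims for ShadowNet's design.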
