Paper Title
Empowering Data Centers for Next Generation Trusted Computing
Paper Authors
Paper Abstract
Modern data centers have grown beyond CPU nodes to provide domain-specific accelerators such as GPUs and FPGAs to their customers. From a security standpoint, cloud customers want to protect their data. They are willing to pay additional costs for trusted execution environments (TEEs) such as enclaves provided by Intel SGX and AMD SEV. Unfortunately, customers have to make a critical choice -- either use domain-specific accelerators for speed or use CPU-based confidential computing solutions. To bridge this gap, we aim to enable data-center-scale confidential computing that spans CPUs and accelerators. We argue that wide-scale TEE support for accelerators would present a technically easier solution, but it is far from being a reality. Instead, our hybrid design provides enclaved execution guarantees for computation distributed over multiple CPU nodes and devices with or without TEE support. Our solution scales gracefully in two dimensions -- it can handle a large number of heterogeneous nodes, and it can accommodate TEE-enabled devices as and when they become available in the future. We observe marginal overheads of $0.42$--$8\%$ on real-world AI data center workloads, independent of the number of nodes in the data center. We add custom TEE support to two accelerators (AI and storage) and integrate them into our solution, thus demonstrating that it can cater to future TEE devices.