Paper Title

LPGNet: Link Private Graph Networks for Node Classification

Authors

Kolluri, Aashish, Baluta, Teodora, Hooi, Bryan, Saxena, Prateek

Abstract

Classification tasks on labeled graph-structured data have many important applications, ranging from social recommendation to financial modeling. Deep neural networks are increasingly being used for node classification on graphs, wherein nodes with similar features have to be given the same label. Graph convolutional networks (GCNs) are a widely studied neural network architecture that performs well on this task. However, powerful link-stealing attacks on GCNs have recently shown that, even with only black-box access to the trained model, inferring which links (or edges) are present in the training graph is practical. In this paper, we present a new neural network architecture called LPGNet for training on graphs with privacy-sensitive edges. LPGNet provides differential privacy (DP) guarantees for edges using a novel design for how graph edge structure is used during training. We empirically show that LPGNet models often lie in the sweet spot between privacy and utility: they offer better utility than "trivially" private architectures that use no edge information (e.g., vanilla MLPs) and better resilience against existing link-stealing attacks than vanilla GCNs, which use the full edge structure. On most of the datasets we evaluate, LPGNet also offers consistently better privacy-utility tradeoffs than DPGCN, the state-of-the-art mechanism for retrofitting differential privacy into conventional GCNs.
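To make the notion of edge-level DP concrete, below is a minimal sketch of the classic randomized-response baseline for protecting graph edges: each potential edge is independently flipped with probability 1/(1+e^ε) before the (noisy) adjacency matrix is used for training. This is an illustration of the kind of edge-DP mechanism DPGCN-style retrofits build on, not LPGNet's own design; the function name and interface are hypothetical.

```python
import numpy as np

def randomized_response_edges(adj, epsilon, seed=None):
    """Release an epsilon-edge-DP version of a 0/1 adjacency matrix.

    Each potential (undirected) edge is flipped independently with
    probability 1 / (1 + e^epsilon): small epsilon -> near-random graph
    (strong privacy, low utility); large epsilon -> graph nearly intact.
    Illustrative randomized-response baseline, not LPGNet's mechanism.
    """
    rng = np.random.default_rng(seed)
    p_flip = 1.0 / (1.0 + np.exp(epsilon))
    flips = rng.random(adj.shape) < p_flip
    # Keep the graph undirected: decide flips on the upper triangle
    # only (excluding the diagonal), then mirror them.
    flips = np.triu(flips, k=1)
    flips = flips | flips.T
    return np.where(flips, 1 - adj, adj)
```

The privacy-utility tension the abstract describes is visible directly in `p_flip`: at ε = 1 roughly 27% of entries flip, heavily perturbing the structure a GCN relies on, which is why architectures designed around edge privacy (rather than noised adjacency matrices) can do better.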
