Paper Title

High-Level Synthesis Performance Prediction using GNNs: Benchmarking, Modeling, and Advancing

Paper Authors

Nan Wu, Hang Yang, Yuan Xie, Pan Li, Cong Hao

Paper Abstract

Agile hardware development requires fast and accurate circuit quality evaluation from early design stages. Existing work on high-level synthesis (HLS) performance prediction usually requires extensive feature engineering after the synthesis process. To expedite circuit evaluation from as early a design stage as possible, we propose rapid and accurate performance modeling that exploits the representation power of graph neural networks (GNNs) by representing C/C++ programs as graphs. The contribution of this work is three-fold. First, we build a standard benchmark containing 40k C synthesizable programs, which includes both synthetic programs and three sets of real-world HLS benchmarks. Each program is implemented on an FPGA to generate ground-truth performance metrics. Second, we formally formulate the HLS performance prediction problem on graphs and propose multiple GNN-based modeling strategies that navigate different trade-offs between prediction timeliness (early/late prediction) and accuracy. Third, we further propose a novel hierarchical GNN that does not sacrifice timeliness but largely improves prediction accuracy, significantly outperforming HLS tools. We conduct extensive evaluations on both synthetic and unseen real-case programs; our proposed predictor outperforms HLS by up to 40X and existing predictors by 2X to 5X in terms of resource usage and timing prediction.
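The abstract describes representing C/C++ programs as graphs and training GNNs to regress post-implementation metrics such as resource usage or timing. Below is a minimal sketch of such a graph-level regressor using PyTorch Geometric; the `PerfGNN` class, feature dimensions, and toy program graph are illustrative assumptions, not the paper's actual hierarchical architecture.

```python
# Minimal sketch (assumed setup, not the paper's model): a graph-level GNN
# that maps a program graph to one scalar performance metric.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, global_mean_pool


class PerfGNN(torch.nn.Module):
    def __init__(self, in_dim: int, hidden: int = 64):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.head = torch.nn.Linear(hidden, 1)  # regress one metric, e.g. LUT count

    def forward(self, x, edge_index, batch):
        x = F.relu(self.conv1(x, edge_index))   # message passing over the program graph
        x = F.relu(self.conv2(x, edge_index))
        x = global_mean_pool(x, batch)          # pool node embeddings to a graph embedding
        return self.head(x).squeeze(-1)


# Toy example: a 3-node "program graph" with 8-dim node features
# (in practice, features would encode operation types, bitwidths, etc.).
x = torch.randn(3, 8)
edge_index = torch.tensor([[0, 1, 1, 2], [1, 0, 2, 1]])  # edges in both directions
batch = torch.zeros(3, dtype=torch.long)                 # all nodes belong to graph 0

model = PerfGNN(in_dim=8)
pred = model(x, edge_index, batch)  # predicted metric for this graph
print(pred.shape)                   # torch.Size([1])
```

In this framing, an "early" predictor consumes graphs built directly from the C/C++ source, while a "late" predictor uses graphs enriched with post-synthesis information, trading timeliness for accuracy as the abstract describes.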
