Paper Title
Statistical Predictions in String Theory and Deep Generative Models
Paper Authors
Paper Abstract
Generative models in deep learning allow for sampling probability distributions that approximate data distributions. We propose using generative models for making approximate statistical predictions in the string theory landscape. For vacua admitting a Lagrangian description this can be thought of as learning random tensor approximations of couplings. As a concrete proof-of-principle, we demonstrate in a large ensemble of Calabi-Yau manifolds that Kahler metrics evaluated at points in Kahler moduli space are well-approximated by ensembles of matrices produced by a deep convolutional Wasserstein GAN. Accurate approximations of the Kahler metric eigenspectra are achieved with far fewer than $h^{11}$ Gaussian draws. Accurate extrapolation to values of $h^{11}$ outside the training set is achieved via a conditional GAN. Together, these results implicitly suggest the existence of strong correlations in the data, as might be expected if Reid's fantasy is correct.
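To make the abstract's pipeline concrete, below is a minimal sketch, in PyTorch, of a Wasserstein GAN (weight-clipping variant) trained to emit $h^{11} \times h^{11}$ symmetric matrices, followed by an eigenspectrum comparison between generated and training ensembles. This is not the authors' implementation: it uses fully connected layers rather than the deep convolutional architecture described above, a synthetic Wishart-like placeholder in place of the Calabi-Yau Kahler-metric data, and illustrative names (`Generator`, `Critic`, `h11`, `real_batch`) throughout.

```python
# Minimal WGAN sketch (weight-clipping variant) for generating h11 x h11
# symmetric matrices. The Wishart-like "training set" is a synthetic
# placeholder, not the paper's Calabi-Yau Kahler-metric data, and the
# fully connected networks stand in for the deep convolutional ones.
import torch
import torch.nn as nn

h11, latent_dim, batch = 10, 16, 64


class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, h11 * h11),
        )

    def forward(self, z):
        m = self.net(z).view(-1, h11, h11)
        return 0.5 * (m + m.transpose(1, 2))  # symmetrize, as a metric should be


class Critic(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(h11 * h11, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, x):
        return self.net(x.view(-1, h11 * h11))


def real_batch():
    # Placeholder data: correlated positive matrices of Wishart type.
    a = torch.randn(batch, h11, 3)
    return a @ a.transpose(1, 2) / 3.0


G, D = Generator(), Critic()
opt_g = torch.optim.RMSprop(G.parameters(), lr=5e-5)
opt_d = torch.optim.RMSprop(D.parameters(), lr=5e-5)

for step in range(200):
    for _ in range(5):  # several critic updates per generator update
        x, z = real_batch(), torch.randn(batch, latent_dim)
        loss_d = D(G(z).detach()).mean() - D(x).mean()  # Wasserstein critic loss
        opt_d.zero_grad()
        loss_d.backward()
        opt_d.step()
        for p in D.parameters():  # weight clipping enforces the Lipschitz bound
            p.data.clamp_(-0.01, 0.01)
    z = torch.randn(batch, latent_dim)
    loss_g = -D(G(z)).mean()  # generator tries to raise the critic's score
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()

# Compare mean eigenvalue spectra of generated and "real" ensembles.
with torch.no_grad():
    gen_eigs = torch.linalg.eigvalsh(G(torch.randn(batch, latent_dim)))
    real_eigs = torch.linalg.eigvalsh(real_batch())
print(gen_eigs.mean(0))
print(real_eigs.mean(0))
```

The closing `eigvalsh` comparison mirrors the eigenspectrum test mentioned in the abstract, here run on placeholder data only; a conditional variant would additionally feed $h^{11}$ (or an embedding of it) to both networks, which is the standard conditional-GAN construction that would support the extrapolation experiment.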