Paper Title


'Deep' Dive into $b \to c$ Anomalies: Standardized and Future-proof Model Selection Using Self-normalizing Neural Networks

Paper Authors

Srimoy Bhattacharya, Soumitra Nandi, Sunando Kumar Patra, Shantanu Sahoo

Paper Abstract


Noting the erroneous proclivity of information-theoretic approaches, such as the Akaike information criterion (AIC), to select simpler models when performing model selection with a small sample size, we address the problem of new-physics model selection in $b\to c\tau\nu_\tau$ decays by employing a specific machine learning algorithm (self-normalizing neural networks, a.k.a. SNNs) for supervised classification and regression, in a model-independent framework. The classification outcomes on the real data set are compared with AIC, with the SNNs outperforming AIC$_c$ in all aspects of model selection, while the regression outcomes are compared with the results of Bayesian analyses; the obtained parameter spaces differ considerably, though the maximum a posteriori (MAP) estimates remain similar. A few of the two-operator scenarios with a tensor-type interaction are found to be the most probable solutions for the data. We also test the effectiveness of our trained networks on the expected, more precise data from Belle-II. The trained networks and associated functionalities are supplied for the use of the community.
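The small-sample correction the abstract alludes to is the standard AIC$_c$, which adds a penalty term $2k(k+1)/(n-k-1)$ that grows as the sample size $n$ approaches the parameter count $k$. The sketch below illustrates the mechanism with hypothetical likelihood values and sample sizes (the numbers are not from the paper); it shows how, at small $n$, AIC$_c$ can favor a simpler model even when a more complex one fits better.

```python
import math


def aic(k, log_likelihood):
    """Akaike information criterion: 2k - 2 ln(L_hat)."""
    return 2 * k - 2 * log_likelihood


def aicc(k, log_likelihood, n):
    """Small-sample corrected AIC (AICc).

    The correction term 2k(k+1)/(n-k-1) blows up as n approaches k,
    penalizing parameter-rich models harder at small sample sizes.
    """
    return aic(k, log_likelihood) + 2 * k * (k + 1) / (n - k - 1)


# Hypothetical comparison at n = 20 data points:
# model A has 2 parameters and a slightly worse fit,
# model B has 5 parameters and a better fit.
n = 20
score_a = aicc(2, -30.0, n)  # 64 + 12/17  ~ 64.71
score_b = aicc(5, -27.0, n)  # 64 + 60/14  ~ 68.29
best = "A" if score_a < score_b else "B"  # lower AICc is preferred
```

Here AIC$_c$ selects the simpler model A despite model B's higher likelihood, the bias toward simpler models at small $n$ that motivates the paper's SNN-based alternative.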
