Paper Title
Twin Neural Network Regression
Paper Authors
Paper Abstract
We introduce twin neural network (TNN) regression. This method predicts differences between the target values of two different data points rather than the targets themselves. The solution of a traditional regression problem is then obtained by averaging over an ensemble of all predicted differences between the targets of an unseen data point and all training data points. Whereas ensembles are normally costly to produce, TNN regression intrinsically creates an ensemble of predictions twice the size of the training set while training only a single neural network. Since ensembles have been shown to be more accurate than single models, this property naturally transfers to TNN regression. We show that TNNs are able to compete with, or yield more accurate predictions than, other state-of-the-art methods on different data sets. Furthermore, TNN regression is constrained by self-consistency conditions. We find that the violation of these conditions provides an estimate of the prediction uncertainty.
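The prediction rule described in the abstract can be sketched in a few lines: train a model on the differences y_i - y_j of all ordered training pairs, then predict an unseen point by averaging F(x, x_i) + y_i over all training anchors x_i. In this minimal sketch a linear least-squares fit stands in for the twin neural network so the example needs only NumPy; the data, names, and model are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Toy 1-D data (illustrative, not from the paper).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=40)
y = 2.0 * X + 0.01 * rng.normal(size=40)  # simple linear target

# All ordered training pairs (x_i, x_j), labeled with the difference y_i - y_j.
i, j = np.meshgrid(np.arange(len(X)), np.arange(len(X)), indexing="ij")
features = np.column_stack([X[i.ravel()], X[j.ravel()], np.ones(i.size)])
diffs = y[i.ravel()] - y[j.ravel()]

# Fit the difference model F(x1, x2) ~ y1 - y2 (least squares stands in
# for the twin network in this sketch).
coef, *_ = np.linalg.lstsq(features, diffs, rcond=None)

def tnn_predict(x_new):
    """Ensemble prediction: y(x) is the mean over i of F(x, x_i) + y_i."""
    feats = np.column_stack([np.full(len(X), x_new), X, np.ones(len(X))])
    return np.mean(feats @ coef + y)

print(tnn_predict(0.3))  # close to the true value 2 * 0.3 = 0.6
```

Note how a single fitted model yields one prediction per training anchor; averaging these is what gives the intrinsic ensemble the abstract refers to, at the cost of quadratically many training pairs.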