Paper title
Deep transfer learning for system identification using long short-term memory neural networks
Paper authors
Paper abstract
Recurrent neural networks (RNNs) have many advantages over more traditional system identification techniques. They can be applied to both linear and nonlinear systems and require fewer modeling assumptions. However, these neural network models may also need large amounts of data to learn and generalize, and training them is a time-consuming process. Hence, building upon long short-term memory (LSTM) neural networks, this paper proposes using two types of deep transfer learning, namely parameter fine-tuning and freezing, to reduce the data and computation requirements of system identification. We apply these techniques to identify two dynamical systems: a second-order linear system and a Wiener-Hammerstein nonlinear system. Results show that, compared with direct learning, our method accelerates learning by 10% to 50% while also saving data and computing resources.
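The paper itself provides no code; the following PyTorch sketch is only a minimal illustration of the two transfer strategies named in the abstract. It shows how an LSTM identifier pretrained on a source system might be adapted to a target system either by freezing the recurrent layers and retraining only the output layer, or by fine-tuning all parameters at a reduced learning rate. The model architecture, layer sizes, file name, and learning rates are hypothetical, not taken from the paper.

```python
import torch
import torch.nn as nn

class LSTMIdentifier(nn.Module):
    """Hypothetical LSTM model mapping an input sequence u(t) to an output y(t)."""
    def __init__(self, n_inputs=1, n_hidden=64, n_layers=2, n_outputs=1):
        super().__init__()
        self.lstm = nn.LSTM(n_inputs, n_hidden, n_layers, batch_first=True)
        self.head = nn.Linear(n_hidden, n_outputs)

    def forward(self, u):
        h, _ = self.lstm(u)   # h: (batch, time, n_hidden)
        return self.head(h)   # predicted output: (batch, time, n_outputs)

model = LSTMIdentifier()
# Assume the model was pretrained on the source system, e.g.:
# model.load_state_dict(torch.load("source_system.pt"))  # hypothetical file

# Strategy 1: freezing -- keep the pretrained recurrent layers fixed and
# retrain only the output layer on the target system's data.
for p in model.lstm.parameters():
    p.requires_grad = False
freeze_optimizer = torch.optim.Adam(model.head.parameters(), lr=1e-3)

# Strategy 2: fine-tuning -- update all parameters, but with a smaller
# learning rate so the pretrained dynamics are only gently adjusted.
for p in model.parameters():
    p.requires_grad = True
finetune_optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
```

Either optimizer can then be used in a standard training loop on the target system's input-output data; freezing trains fewer parameters per step, while fine-tuning retains more flexibility at a higher computational cost.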