Paper Title

Semi-supervised URL Segmentation with Recurrent Neural Networks Pre-trained on Knowledge Graph Entities

Paper Authors

Hao Zhang, Jae Ro, Richard Sproat

Paper Abstract

Breaking domain names such as openresearch into component words open and research is important for applications like Text-to-Speech synthesis and web search. We link this problem to the classic problem of Chinese word segmentation and show the effectiveness of a tagging model based on Recurrent Neural Networks (RNNs) using characters as input. To compensate for the lack of training data, we propose a pre-training method on concatenated entity names in a large knowledge database. Pre-training improves the model by 33% and brings the sequence accuracy to 85%.
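To make the tagging formulation concrete, the following is a minimal sketch (not the authors' implementation) of a character-level bidirectional LSTM that labels each character of a domain name with a B (begin-word) or I (inside-word) tag, analogous to Chinese word segmentation. The PyTorch framing, the two-tag scheme, the CharSegmenter class, and all hyperparameters are illustrative assumptions; the pre-training step described in the abstract would correspond to running the same training loop on concatenated entity names from a knowledge base before fine-tuning on URL data.

```python
# Illustrative sketch only: a character-level BiLSTM tagger for domain-name
# segmentation, cast as sequence labeling (B = begin word, I = inside word).
import torch
import torch.nn as nn

class CharSegmenter(nn.Module):
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64, num_tags=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                           bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, num_tags)  # per-character B/I logits

    def forward(self, char_ids):
        h, _ = self.rnn(self.embed(char_ids))
        return self.out(h)

# Toy usage: learn to segment "openresearch" into "open" + "research".
chars = sorted(set("abcdefghijklmnopqrstuvwxyz"))
char2id = {c: i for i, c in enumerate(chars)}
model = CharSegmenter(vocab_size=len(chars))

url = "openresearch"
x = torch.tensor([[char2id[c] for c in url]])                # shape (1, 12)
gold = torch.tensor([[0, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1]])  # B I I I  B I I I I I I I

# One supervised update; pre-training would run the same loop on
# concatenated entity names drawn from a knowledge database.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = nn.functional.cross_entropy(model(x).squeeze(0), gold.squeeze(0))
loss.backward()
optimizer.step()

with torch.no_grad():
    tags = model(x).argmax(-1).squeeze(0).tolist()
print(tags)  # predicted B/I tags; each B marks the start of a new component word
```

The two-tag B/I scheme is the simplest choice for this sketch; richer schemes such as BMES, or a CRF layer on top of the RNN outputs, are common variants in the word-segmentation literature.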
