Paper Title


An enhanced Tree-LSTM architecture for sentence semantic modeling using typed dependencies

Paper Authors

Jeena Kleenankandy, K. A. Abdul Nazeer

Paper Abstract


Tree-based Long Short-Term Memory (LSTM) networks have become the state of the art for modeling the meaning of language texts, as they can effectively exploit grammatical syntax and thereby the non-linear dependencies among the words of a sentence. However, most of these models cannot recognize differences in meaning caused by a change in the semantic roles of words or phrases, because they do not take into account the types of grammatical relations, also known as typed dependencies, in the sentence structure. This paper proposes an enhanced LSTM architecture, called relation gated LSTM, which can model the relationship between two inputs of a sequence using a control input. We also introduce a Tree-LSTM model called Typed Dependency Tree-LSTM that uses the dependency parse structure of a sentence, together with the dependency types, to embed the sentence meaning into a dense vector. The proposed model outperformed its type-unaware counterpart on two typical NLP tasks, Semantic Relatedness Scoring and Sentiment Analysis, in fewer training epochs. The results were comparable to or competitive with other state-of-the-art models. Qualitative analysis showed that changes in the voice of sentences had little effect on the model's predicted scores, while changes in nominal (noun) words had a more significant impact. The model recognized subtle semantic relationships in sentence pairs. The magnitudes of the learned typed-dependency embeddings were also in agreement with human intuition. These findings point to the significance of grammatical relations in sentence modeling, and the proposed models can serve as a base for future research in this direction.
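To make the idea of conditioning a Tree-LSTM on typed dependencies more concrete, below is a minimal, hypothetical sketch, not the authors' reference implementation. It shows a Child-Sum Tree-LSTM cell (in the style of Tai et al., 2015) in which each child's hidden state is modulated by a learned embedding of its dependency relation, one plausible reading of the "relation gated" control input described in the abstract. All class names, dimensions, and the gating formula here are illustrative assumptions.

```python
# Hypothetical sketch of a typed-dependency-aware Child-Sum Tree-LSTM cell.
# The relation gate (sigmoid of a dependency-type embedding) is an assumed
# mechanism for illustration, not the paper's exact formulation.
import torch
import torch.nn as nn


class TypedDependencyTreeLSTMCell(nn.Module):
    def __init__(self, input_dim: int, hidden_dim: int, num_relations: int):
        super().__init__()
        self.hidden_dim = hidden_dim
        # One embedding vector per dependency relation type (e.g. nsubj, dobj).
        self.rel_embed = nn.Embedding(num_relations, hidden_dim)
        # Standard Child-Sum Tree-LSTM projections for input/output/update gates.
        self.iou_x = nn.Linear(input_dim, 3 * hidden_dim)
        self.iou_h = nn.Linear(hidden_dim, 3 * hidden_dim, bias=False)
        # Forget-gate projections (one forget gate per child).
        self.f_x = nn.Linear(input_dim, hidden_dim)
        self.f_h = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, x, child_h, child_c, child_rel):
        """
        x:         (input_dim,)               word embedding of the current node
        child_h:   (num_children, hidden_dim) hidden states of the children
        child_c:   (num_children, hidden_dim) cell states of the children
        child_rel: (num_children,)            integer ids of the typed dependencies
        """
        # Relation gate: scale each child's hidden state by a sigmoid of its
        # dependency-type embedding (the hedged "control input").
        rel_gate = torch.sigmoid(self.rel_embed(child_rel))           # (k, H)
        gated_child_h = rel_gate * child_h                            # (k, H)
        h_sum = gated_child_h.sum(dim=0)                              # (H,)

        # Input, output, and update gates from the word and gated child states.
        i, o, u = torch.chunk(self.iou_x(x) + self.iou_h(h_sum), 3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)

        # One forget gate per child, computed from the gated child hidden state.
        f = torch.sigmoid(self.f_x(x).unsqueeze(0) + self.f_h(gated_child_h))
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c
```

Composing these cells bottom-up over a dependency parse yields a root vector that can feed the relatedness-scoring or sentiment heads mentioned in the abstract; the key difference from a plain Child-Sum Tree-LSTM is that children connected by different relation types contribute differently to the parent state.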
