Learning graph attention-aware knowledge graph embedding
Li, Chen1; Peng, Xutan6; Niu, Yuhang1,2; Zhang, Shanghang3; Peng, Hao1,5; Zhou, Chuan4; Li, Jianxin1
2021-10-21
Journal: NEUROCOMPUTING
ISSN: 0925-2312
Volume: 461, Pages: 516-529
Abstract: The knowledge graph, which utilizes a graph structure to represent multi-relational data, has been widely used in reasoning and prediction tasks and has recently attracted considerable research effort. However, most existing works still learn knowledge graph embeddings straightforwardly and intuitively, without subtly considering the context of knowledge. Specifically, recent models treat each single triple independently or consider contexts indiscriminately, which is one-sided because each knowledge unit (i.e., triple) can be derived from only part of its surrounding triples. In this paper, we propose a graph-attention-based model to encode entities, which formulates a knowledge graph as an irregular graph and explores a number of concrete and interpretable knowledge compositions by integrating the graph-structured information via multiple independent channels. To measure the correlation between entities from different angles (i.e., entity pair, relation, and structure), we develop three corresponding attention metrics. Using our enhanced entity embeddings, we further introduce several improved factorization functions for updating relation embeddings and evaluating candidate triples. We conduct extensive experiments on downstream tasks, including entity classification, entity typing, and link prediction, to validate our methods. Empirical results confirm the importance of the introduced attention metrics and demonstrate that the proposed method improves the performance of factorization models on large-scale knowledge graphs. (c) 2021 Elsevier B.V. All rights reserved.
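To make the high-level description above concrete, the following minimal Python/NumPy sketch illustrates the general pattern the abstract describes: an entity embedding is updated by attending over its neighbouring (relation, entity) pairs, and a candidate triple is then scored with a simple bilinear (DistMult-style) factorization. This is not the authors' implementation; all names (attention_aggregate, score_triple, W, a) and shapes are illustrative assumptions.

# Illustrative sketch only: generic attention-based neighbourhood aggregation
# for knowledge graph entity embeddings, plus a DistMult-style triple score.
# NOT the paper's exact model; names and shapes are assumptions.
import numpy as np

rng = np.random.default_rng(0)
dim = 8

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def attention_aggregate(h_center, neighbours, W, a):
    """Update a central entity embedding from its neighbouring (relation, entity) pairs.

    h_center   : (dim,) embedding of the central entity
    neighbours : list of (relation_emb, entity_emb) pairs from surrounding triples
    W          : (dim, 3*dim) projection applied to [center; relation; neighbour]
    a          : (dim,) attention vector producing one score per neighbour
    """
    if not neighbours:
        return h_center
    messages, scores = [], []
    for r_emb, t_emb in neighbours:
        z = W @ np.concatenate([h_center, r_emb, t_emb])   # composed message
        messages.append(z)
        scores.append(a @ np.tanh(z))                      # unnormalised attention score
    alpha = softmax(np.array(scores))                      # attention weights over neighbours
    agg = sum(w * m for w, m in zip(alpha, messages))      # weighted aggregation
    return np.tanh(h_center + agg)

def score_triple(h, r, t):
    """DistMult-style bilinear score for a candidate triple (h, r, t)."""
    return float(np.sum(h * r * t))

# Toy usage with random embeddings.
h = rng.normal(size=dim)
neighbours = [(rng.normal(size=dim), rng.normal(size=dim)) for _ in range(3)]
W = rng.normal(size=(dim, 3 * dim))
a = rng.normal(size=dim)
h_new = attention_aggregate(h, neighbours, W, a)
print(score_triple(h_new, rng.normal(size=dim), rng.normal(size=dim)))

In the paper, a single attention vector like `a` would presumably be replaced by the three attention metrics (entity pair, relation, structure) computed over multiple independent channels, and the aggregated embeddings feed the improved factorization functions; the sketch only illustrates the overall flow.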
Keywords: Knowledge graph embedding; Graph attention mechanism; Entity typing; Link prediction
DOI: 10.1016/j.neucom.2021.01.139
Indexed by: SCI
Language: English
Funding Projects: Natural Science Foundation of China program [U20B2053]; Natural Science Foundation of China program [62002007]; Natural Science Foundation of China program [61772151]; S&T Program of Hebei [20310101D]; S&T Program of Hebei [SKLSDE-2020ZX-12]
WOS Research Area: Computer Science
WOS Category: Computer Science, Artificial Intelligence
WOS Record Number: WOS:000704086300003
Publisher: ELSEVIER
Document Type: Journal article
Identifier: http://ir.amss.ac.cn/handle/2S8OKBNM/59348
Collection: Institute of Applied Mathematics
Corresponding Author: Li, Chen
作者单位1.Beihang Univ, Beijing Adv Innovat Ctr Big Data & Brain Comp, Beijing, Peoples R China
2.Beihang Univ, State Key Lab Software Dev Environm Lab, Beijing, Peoples R China
3.Carnegie Mellon Univ, Dept Elect & Comp Engn, Pittsburgh, PA USA
4.Chinese Acad Sci, Acad Math & Syst Sci, Beijing, Peoples R China
5.Beihang Univ, Sch Cyber Sci & Technol, Beijing, Peoples R China
6.Univ Sheffield, Dept Comp Sci, Sheffield, S Yorkshire, England
Recommended Citation:
GB/T 7714: Li, Chen, Peng, Xutan, Niu, Yuhang, et al. Learning graph attention-aware knowledge graph embedding[J]. NEUROCOMPUTING, 2021, 461: 516-529.
APA: Li, Chen., Peng, Xutan., Niu, Yuhang., Zhang, Shanghang., Peng, Hao., ... & Li, Jianxin. (2021). Learning graph attention-aware knowledge graph embedding. NEUROCOMPUTING, 461, 516-529.
MLA: Li, Chen, et al. "Learning graph attention-aware knowledge graph embedding". NEUROCOMPUTING 461 (2021): 516-529.
Files in This Item:
There are no files associated with this item.