KMS of Academy of Mathematics and Systems Science, CAS
Learning graph attention-aware knowledge graph embedding
Li, Chen [1]; Peng, Xutan [6]; Niu, Yuhang [1,2]; Zhang, Shanghang [3]; Peng, Hao [1,5]; Zhou, Chuan [4]
2021-10-21
Source Publication | NEUROCOMPUTING
ISSN | 0925-2312 |
Volume | 461 | Pages | 516-529 |
Abstract | The knowledge graph, which utilizes a graph structure to represent multi-relational data, has been widely used in reasoning and prediction tasks and has attracted considerable research effort recently. However, most existing works still learn knowledge graph embeddings straightforwardly and intuitively, without subtly considering the context of knowledge. Specifically, recent models deal with each single triple independently or consider contexts indiscriminately, which is one-sided, as each knowledge unit (i.e., triple) can be derived from only part of its surrounding triples. In this paper, we propose a graph-attention-based model to encode entities, which formulates a knowledge graph as an irregular graph and explores a number of concrete and interpretable knowledge compositions by integrating the graph-structured information via multiple independent channels. To measure the correlation between entities from different angles (i.e., entity pair, relation, and structure), we develop three attention metrics, respectively. By making use of our enhanced entity embeddings, we further introduce several improved factorization functions for updating relation embeddings and evaluating candidate triples. We conduct extensive experiments on downstream tasks, including entity classification, entity typing, and link prediction, to validate our methods. Empirical results validate the importance of our introduced attention metrics and demonstrate that our proposed method can improve the performance of factorization models on large-scale knowledge graphs. (c) 2021 Elsevier B.V. All rights reserved. |
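A minimal sketch of the two ingredients the abstract describes (attention-weighted neighbourhood aggregation for entity embeddings, followed by a factorization-style score over candidate triples) is given below. This is not the authors' implementation: the embedding dimension, the entity-pair dot-product attention, the residual-style update, and the DistMult-style scoring function are all assumptions made for illustration.

```python
# Hypothetical sketch: graph-attention aggregation over a knowledge graph
# neighbourhood, then a DistMult-style factorization score for a candidate
# triple. All names and design choices here are illustrative assumptions,
# not the paper's actual model.
import numpy as np

rng = np.random.default_rng(0)
dim = 8                                   # embedding dimension (assumed)
n_entities, n_relations = 5, 3

E = rng.normal(size=(n_entities, dim))    # entity embeddings
R = rng.normal(size=(n_relations, dim))   # relation embeddings

# Toy neighbourhood: (head, relation, tail) triples around entity 0.
triples = [(0, 0, 1), (0, 1, 2), (0, 2, 3)]

def softmax(x):
    x = x - x.max()                       # numerical stability
    e = np.exp(x)
    return e / e.sum()

def attend(center, triples, E, R):
    """Aggregate the neighbours of `center` with attention weights derived
    from an entity-pair compatibility score (one of several possible
    attention metrics; relation- or structure-based scores could be used
    in its place)."""
    neigh = [(r, t) for h, r, t in triples if h == center]
    scores = np.array([E[center] @ E[t] for r, t in neigh])   # entity-pair metric
    alpha = softmax(scores)
    agg = sum(a * (E[t] + R[r]) for a, (r, t) in zip(alpha, neigh))
    return E[center] + agg                # residual-style update (assumption)

def distmult_score(h_vec, r, t, R, E):
    """DistMult-style factorization score <h, r, t> for ranking candidates."""
    return float(np.sum(h_vec * R[r] * E[t]))

h_new = attend(0, triples, E, R)
print(distmult_score(h_new, 0, 1, R, E))  # score for candidate triple (0, 0, 1)
```

Relation- or structure-based attention metrics, as mentioned in the abstract, could be substituted for the entity-pair score inside `attend` without changing the surrounding flow.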
Keyword | Knowledge graph embedding; Graph attention mechanism; Entity typing; Link prediction |
DOI | 10.1016/j.neucom.2021.01.139 |
Indexed By | SCI |
Language | English |
Funding Project | Natural Science Foundation of China program[U20B2053] ; Natural Science Foundation of China program[62002007] ; Natural Science Foundation of China program[61772151] ; S&T Program of Hebei[20310101D] ; S&T Program of Hebei[SKLSDE-2020ZX-12] |
WOS Research Area | Computer Science |
WOS Subject | Computer Science, Artificial Intelligence |
WOS ID | WOS:000704086300003 |
Publisher | ELSEVIER |
Document Type | Journal article |
Identifier | http://ir.amss.ac.cn/handle/2S8OKBNM/59348 |
Collection | Institute of Applied Mathematics |
Corresponding Author | Li, Chen |
Affiliation | 1. Beihang Univ, Beijing Adv Innovat Ctr Big Data & Brain Comp, Beijing, Peoples R China; 2. Beihang Univ, State Key Lab Software Dev Environm, Beijing, Peoples R China; 3. Carnegie Mellon Univ, Dept Elect & Comp Engn, Pittsburgh, PA, USA; 4. Chinese Acad Sci, Acad Math & Syst Sci, Beijing, Peoples R China; 5. Beihang Univ, Sch Cyber Sci & Technol, Beijing, Peoples R China; 6. Univ Sheffield, Dept Comp Sci, Sheffield, S Yorkshire, England |
Recommended Citation GB/T 7714 | Li, Chen, Peng, Xutan, Niu, Yuhang, et al. Learning graph attention-aware knowledge graph embedding[J]. NEUROCOMPUTING, 2021, 461: 516-529. |
APA | Li, Chen., Peng, Xutan., Niu, Yuhang., Zhang, Shanghang., Peng, Hao., ... & Li, Jianxin. (2021). Learning graph attention-aware knowledge graph embedding. NEUROCOMPUTING, 461, 516-529. |
MLA | Li, Chen, et al. "Learning graph attention-aware knowledge graph embedding". NEUROCOMPUTING 461 (2021): 516-529. |
Files in This Item: | There are no files associated with this item. |