Title: Inductively Representing Out-of-Knowledge-Graph Entities by Optimal Estimation Under Translational Assumptions
Authors: Dai, Damai; Zheng, Hua; Luo, Fuli; Yang, Pengcheng; Chang, Baobao; Sui, Zhifang
Affiliation: Peking University, Key Laboratory of Computational Linguistics (MOE), Beijing, China; Peng Cheng Laboratory, Shenzhen, China
Issue Date: 2021
Publisher: REPL4NLP 2021: Proceedings of the 6th Workshop on Representation Learning for NLP
Abstract: Conventional Knowledge Graph Completion (KGC) assumes that all test entities appear during training. In real-world scenarios, however, Knowledge Graphs (KGs) evolve quickly, with out-of-knowledge-graph (OOKG) entities added frequently, and these entities need to be represented efficiently. Most existing Knowledge Graph Embedding (KGE) methods cannot represent OOKG entities without costly retraining on the whole KG. To improve efficiency, we propose a simple and effective method that inductively represents OOKG entities by their optimal estimation under translational assumptions. Moreover, given pretrained embeddings of the in-knowledge-graph (IKG) entities, our method requires no additional learning. Experimental results on two KGC tasks with OOKG entities show that our method outperforms previous methods by a large margin with higher efficiency.
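The core idea in the abstract admits a compact illustration. The following is a minimal sketch, not the authors' released implementation: under a TransE-style translational assumption (h + r ≈ t), each triple linking a new OOKG entity to an in-KG neighbor yields one estimate of its embedding, and the mean of those estimates is the least-squares-optimal point estimate. All function and variable names below are illustrative.

import numpy as np

def estimate_ookg_embedding(triples, entity_emb, relation_emb, ookg_id):
    """Estimate an OOKG entity's embedding from its in-KG neighbors.

    triples: iterable of (head, relation, tail) ids involving ookg_id
    entity_emb, relation_emb: dicts mapping ids to pretrained vectors
    """
    estimates = []
    for h, r, t in triples:
        if h == ookg_id and t in entity_emb:
            # OOKG entity is the head: from h + r ~ t, estimate h ~ t - r
            estimates.append(entity_emb[t] - relation_emb[r])
        elif t == ookg_id and h in entity_emb:
            # OOKG entity is the tail: from h + r ~ t, estimate t ~ h + r
            estimates.append(entity_emb[h] + relation_emb[r])
    if not estimates:
        raise ValueError("no in-KG neighbors to estimate from")
    # The mean minimizes the summed squared translational error over
    # the neighbor-derived estimates (the "optimal estimation" claim).
    return np.mean(estimates, axis=0)

Because the estimate is closed-form, it involves no gradient updates once pretrained IKG embeddings are given, which is consistent with the abstract's claim that no additional learning is needed.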
URI: http://hdl.handle.net/20.500.11897/624537
ISBN: 978-1-954085-72-5
Indexed: CPCI-SSH (ISSHP); CPCI-S (ISTP)
Appears in Collections: Key Laboratory of Computational Linguistics, Ministry of Education (计算语言学教育部重点实验室)

Files in This Work
There are no files associated with this item.

License: See PKU IR operational policies.