Abstract
Inductive reasoning in knowledge graphs excels at predicting relationships between previously unseen entities, a task that transductive methods often fail to address effectively. Most current inductive approaches to relationship prediction in knowledge graphs depend on closed subgraphs for inference. However, these methods face significant limitations: overdependence on closed subgraphs can omit valuable neighbourhood relational information, especially in sparse subgraphs, which hinders the effective utilisation of available information. To address this issue, we propose a novel method that incorporates adjacent relationships (those directly linked to the target triple) and extracts first-order adjacent relationship subgraphs centred on the target triple. The extracted information, comprising the target triple and its direct contextual relationships from both the head and tail entities, is then input into a pre-trained BERT model for further fine-tuning. Our empirical findings indicate that our approach surpasses other state-of-the-art models when evaluated on the FB15K-237 and WN18RR datasets. Specifically, compared with the SiaILP hybrid model introduced in 2024, our BERTNR model shows an average improvement of 3.9% in the AUC-PR metric and 9.63% in the Hits@10 metric across both datasets. These results underscore the effectiveness of integrating neighbourhood relational learning with BERT for inductive relationship prediction, especially for sparse graphs.
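The abstract describes serialising a target triple together with the first-order relations adjacent to its head and tail entities into a single text sequence for BERT fine-tuning. The sketch below illustrates one plausible way to do this; the function name, the `[CLS]`/`[SEP]` layout, and the neighbour cap are assumptions for illustration, not the paper's actual implementation.

```python
# Hypothetical sketch of the serialisation step (not the paper's code):
# combine a target triple with the relations adjacent to its head and
# tail entities into one text sequence suitable for BERT fine-tuning.

def build_bert_input(triple, head_neighbours, tail_neighbours, max_neighbours=5):
    """triple: (head, relation, tail); neighbour lists hold adjacent relation names.

    max_neighbours is an assumed cap to keep the sequence within BERT's
    input length; the paper does not specify this value.
    """
    head, rel, tail = triple
    head_ctx = " ".join(head_neighbours[:max_neighbours])
    tail_ctx = " ".join(tail_neighbours[:max_neighbours])
    # Layout: [CLS] target triple [SEP] head-side relations [SEP] tail-side relations [SEP]
    return f"[CLS] {head} {rel} {tail} [SEP] {head_ctx} [SEP] {tail_ctx} [SEP]"

# Example: a target triple with two adjacent relations on each side.
seq = build_bert_input(
    ("Paris", "capital_of", "France"),
    ["located_in", "has_population"],
    ["member_of", "borders"],
)
```

In practice the resulting string would be passed through a BERT tokenizer (which adds its own special tokens) before fine-tuning; the explicit markers above are shown only to make the sequence layout visible.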
