Research Article

Effective negative triplet sampling for knowledge graph embedding

Pages 2075-2087 | Received 01 Sep 2022, Published online: 16 Dec 2022
Abstract

Knowledge graphs contain only positive triplet facts, so negative triplets must be generated carefully to train embedding models. Early approaches applied Uniform and Bernoulli sampling, but these suffer from the zero-loss problem during training, which degrades the performance of embedding models. More recently, generative adversarial techniques introduced dynamic negative sampling and achieved better performance by eliminating zero loss, at the cost of increased model complexity and additional training parameters. NSCaching balances performance and complexity by generating a single negative triplet for each positive triplet, with a focus on avoiding vanishing gradients. This paper addresses the zero-loss training problem caused by low-scored negative triplets by proposing an extended version of NSCaching that generates high-scored negative triplets to improve training performance. The proposed method is evaluated with semantic matching knowledge graph embedding models on benchmark datasets, where the results show improvements on all evaluation metrics.
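The cache-based strategy the abstract describes can be illustrated with a minimal sketch. This is not the paper's implementation: the scorer below is a DistMult-style semantic matching function, and all names, sizes, and the cache-refresh policy (draw uniform candidates, keep the top-scored ones per head-relation pair, return a high-scored negative) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy embeddings; sizes are illustrative only.
n_entities, n_relations, dim = 50, 5, 8
ent_emb = rng.normal(size=(n_entities, dim))
rel_emb = rng.normal(size=(n_relations, dim))

def score(h, r, t):
    """DistMult-style semantic matching score <e_h, w_r, e_t>."""
    return float(np.sum(ent_emb[h] * rel_emb[r] * ent_emb[t]))

class NegativeCache:
    """Keeps a small cache of high-scored negative tails per (h, r) pair,
    so training sees hard negatives instead of easy, zero-loss ones."""

    def __init__(self, cache_size=10, n_candidates=20):
        self.cache_size = cache_size
        self.n_candidates = n_candidates
        self.cache = {}  # (h, r) -> list of candidate tail entity ids

    def sample(self, h, r, t):
        key = (h, r)
        # Refresh step: draw uniform candidate tails, merge with the
        # current cache, drop the true tail, and keep the top-k by score.
        candidates = rng.integers(0, n_entities, size=self.n_candidates)
        pool = list(set(self.cache.get(key, [])) | set(candidates.tolist()) - {t})
        pool.sort(key=lambda e: score(h, r, e), reverse=True)
        self.cache[key] = pool[: self.cache_size]
        # Return one high-scored negative triplet's tail from the cache.
        return int(rng.choice(self.cache[key]))
```

A training loop would call `sample(h, r, t)` once per positive triplet `(h, r, t)` to obtain a hard negative `(h, r, t')`, keeping the per-step cost low while still avoiding low-scored negatives that contribute (near-)zero loss.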
