Open Access Article

Embedding Learning with Triple Trustiness on Noisy Knowledge Graph

Financial Intelligence and Financial Engineering Key Laboratory of Sichuan Province, School of Economic Information Engineering, Southwestern University of Finance and Economics, Chengdu 611130, China
Laboratoire d’Informatique de Paris 6 (LIP6), Université Pierre et Marie Curie, 75252 Paris, France
Authors to whom correspondence should be addressed.
Entropy 2019, 21(11), 1083;
Received: 22 October 2019 / Revised: 3 November 2019 / Accepted: 5 November 2019 / Published: 6 November 2019
(This article belongs to the Special Issue Information Theory and Graph Signal Processing)
Embedding learning on knowledge graphs (KGs) aims to encode all entities and relationships into a continuous vector space, which provides an effective and flexible way to implement downstream knowledge-driven artificial intelligence (AI) and natural language processing (NLP) tasks. Since KG construction usually relies on automatic mechanisms with little human supervision, it inevitably introduces considerable noise into KGs. However, most conventional KG embedding approaches inappropriately assume that all facts in existing KGs are completely correct and ignore this noise, which can lead to serious errors. To address this issue, in this paper we propose a novel approach to learn embeddings with triple trustiness on KGs, which takes possible noise into consideration. Specifically, we calculate the trustiness value of triples based on the rich and relatively reliable information provided by the large number of entity type instances and entity descriptions in KGs. In addition, we present a cross-entropy based loss function for model optimization. In experiments, we evaluate our models on KG noise detection, KG completion, and triple classification. Through extensive experiments on three datasets, we demonstrate that our proposed model learns better embeddings than all baselines on noisy KGs.
Keywords: knowledge graph; embedding learning; cross entropy; noise detection; triple trustiness
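The core idea in the abstract — weighting each triple's contribution to a cross-entropy objective by an estimated trustiness score — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the TransE-style scoring function, the toy triples, and the hard-coded `trust` values (which the paper instead derives from entity types and descriptions) are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy KG: entity/relation vocabularies and (head, relation, tail) triples.
entities = ["Paris", "France", "Berlin", "Germany"]
relations = ["capital_of"]
triples = [(0, 0, 1), (2, 0, 3), (0, 0, 3)]  # last triple is noisy

# Hypothetical trustiness values in [0, 1]; the paper computes these from
# entity type instances and entity descriptions (formula not reproduced here).
trust = np.array([0.95, 0.90, 0.10])

dim = 8
E = rng.normal(scale=0.1, size=(len(entities), dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(len(relations), dim))  # relation embeddings

def score(h, r, t):
    """TransE-style plausibility score: higher means more plausible."""
    return -np.linalg.norm(E[h] + R[r] - E[t])

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def trust_weighted_xent(triples, trust):
    """Cross-entropy loss whose target for each triple is its trustiness,
    so low-trust (likely noisy) triples push their plausibility down."""
    loss = 0.0
    for (h, r, t), w in zip(triples, trust):
        p = np.clip(sigmoid(score(h, r, t)), 1e-9, 1 - 1e-9)
        loss += -(w * np.log(p) + (1 - w) * np.log(1 - p))
    return loss / len(triples)

print(trust_weighted_xent(triples, trust))
```

Minimizing this loss with gradient descent over `E` and `R` would drive high-trust triples toward high plausibility and low-trust triples toward low plausibility, which is the behavior the abstract describes for noise-aware embedding learning.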
Figure 1

MDPI and ACS Style

Zhao, Y.; Feng, H.; Gallinari, P. Embedding Learning with Triple Trustiness on Noisy Knowledge Graph. Entropy 2019, 21, 1083.

