Network Embedding via a Bi-Mode and Deep Neural Network Model
Abstract: Network embedding (NE) is an important method for learning representations of a network in a low-dimensional space. Conventional NE models focus on capturing the structural and semantic information of vertices while neglecting such information for edges. In this work, we propose a novel NE model, named BimoNet, that captures both the structural and semantic information of edges. BimoNet is composed of two parts: a bi-mode embedding part and a deep neural network part. In the bi-mode embedding part, the first mode, named the add-mode, expresses the entity-shared features of edges, while the second mode, named the subtract-mode, represents the entity-specific features of edges; these features reflect the semantic information. In the deep neural network part, we first regard the edges of a network as nodes and the vertices as links, which does not change the overall structure of the network. We then take the nodes' adjacency matrix as the input to the deep neural network, so that nodes with similar structure obtain similar representations. By jointly optimizing the objective functions of these two parts, BimoNet preserves both the semantic and structural information of edges. In experiments, we evaluate BimoNet on three real-world datasets and on the task of relation extraction, and BimoNet consistently outperforms state-of-the-art baseline models.
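The two ideas in the abstract can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not the paper's implementation: `bimode_edge_features` shows the add-mode (entity-shared) and subtract-mode (entity-specific) combinations of two endpoint embeddings, and `line_graph_adjacency` shows the edge-to-node transformation, in which two edges become adjacent nodes whenever they share a vertex. All function names and the toy graph are assumptions for illustration.

```python
import numpy as np

def bimode_edge_features(h, t):
    """Hypothetical sketch of the bi-mode idea: the add-mode combines the
    two endpoint embeddings to capture entity-shared features, while the
    subtract-mode captures entity-specific features."""
    add_mode = h + t        # entity-shared features of the edge
    sub_mode = h - t        # entity-specific features of the edge
    return add_mode, sub_mode

def line_graph_adjacency(edges):
    """Regard edges as nodes: two edges are linked iff they share a vertex.
    The rows of the resulting adjacency matrix would then be fed to the
    deep neural network part."""
    m = len(edges)
    A = np.zeros((m, m), dtype=int)
    for i in range(m):
        for j in range(i + 1, m):
            if set(edges[i]) & set(edges[j]):
                A[i, j] = A[j, i] = 1
    return A

# Toy path graph 0-1-2-3: edges (0,1) and (1,2) share vertex 1,
# so they become adjacent nodes in the transformed network.
edges = [(0, 1), (1, 2), (2, 3)]
A = line_graph_adjacency(edges)
```

In this toy example, `A[0, 1] == 1` because edges `(0, 1)` and `(1, 2)` share vertex 1, while `A[0, 2] == 0` because `(0, 1)` and `(2, 3)` share no vertex.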
Fang, Y.; Zhao, X.; Tan, Z.; Xiao, W. Network Embedding via a Bi-Mode and Deep Neural Network Model. Symmetry 2018, 10, 180.