A Survey on Knowledge Graph Embedding: Approaches, Applications and Benchmarks

Publication date: Mar 23, 2025

Traditional KG embedding models rely solely on the triplet structure to learn embeddings: following the translation principle, the embedding of the tail entity should lie close to the embedding of the head entity plus the relation embedding. Models that go beyond this incorporate external data sources, such as textual descriptions of entities and relations or multi-hop relation paths, to enrich the embedding process and capture deeper semantic meaning.

For multi-hop paths, the path embedding is obtained by combining the embeddings of the individual relations along the path with a composition operator (addition, multiplication, or a recurrent neural network (RNN)). An LSTM composer captures the sequence and dependencies of the relations, and the scoring function is based on the similarity between the path embedding and the target relation embedding. Among the neural models, NTN (Neural Tensor Network) can capture complex semantic patterns but has a large number of parameters, while MLP (Multi-Layer Perceptron) uses a simpler feed-forward network to learn entity and relation embeddings.

DKRL extends structure-only models by also embedding the rich textual information found in entity descriptions. Its CNN encoder applies convolutional filters over word embeddings to capture hierarchical patterns and contextual dependencies, producing richer and more informative entity embeddings than a simple bag-of-words encoding.
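
The translation principle and path composition can be made concrete with a small sketch. The following is a minimal illustration, not the survey's (or any cited model's) implementation: the toy entities and relations, the embedding dimension, the random initialization, and the choice of additive composition are all assumptions for demonstration.

```python
# Minimal sketch of translation-based triplet scoring (h + r ~ t) and
# additive multi-hop path composition. Toy entities, relations, and the
# embedding dimension are illustrative assumptions, not from the survey.
import numpy as np

rng = np.random.default_rng(0)
dim = 50  # embedding dimension (assumed)

entities = {"paris": 0, "france": 1, "europe": 2}   # hypothetical vocabulary
relations = {"capital_of": 0, "located_in": 1}      # hypothetical vocabulary

E = rng.normal(scale=0.1, size=(len(entities), dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(len(relations), dim))  # relation embeddings

def score_triplet(h: int, r: int, t: int) -> float:
    """Translation principle: a plausible (h, r, t) has small ||h + r - t||."""
    return -float(np.linalg.norm(E[h] + R[r] - E[t]))

def compose_path(rel_ids: list[int]) -> np.ndarray:
    """Additive composition of a relation path: r1 + r2 + ... + rk."""
    return R[rel_ids].sum(axis=0)

# Direct triplet score for (paris, capital_of, france).
direct = score_triplet(entities["paris"], relations["capital_of"],
                       entities["france"])

# Score a two-hop path (capital_of -> located_in) against a target relation
# by comparing the composed path embedding with that relation's embedding.
path_emb = compose_path([relations["capital_of"], relations["located_in"]])
path_score = -float(np.linalg.norm(path_emb - R[relations["located_in"]]))
```

Replacing compose_path with an RNN or LSTM run over R[rel_ids] gives the sequence-aware composition described above, at the cost of additional parameters.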
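Similarly, a DKRL-style description encoder can be sketched as a single convolution-plus-max-pooling pass over pretrained word vectors. The window size, filter count, activation, and random inputs below are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of a CNN description encoder in the spirit of DKRL:
# slide a window over the description's word vectors, apply filters,
# then max-pool over positions. Hyperparameters are assumed, not DKRL's.
import numpy as np

rng = np.random.default_rng(1)
word_dim, n_filters, window = 50, 50, 2  # assumed sizes

def cnn_encode(word_vectors: np.ndarray, W: np.ndarray,
               b: np.ndarray) -> np.ndarray:
    """Convolve over word windows, apply tanh, max-pool over positions."""
    n_words = word_vectors.shape[0]
    feature_maps = []
    for i in range(n_words - window + 1):
        segment = word_vectors[i:i + window].reshape(-1)  # concat window
        feature_maps.append(np.tanh(W @ segment + b))
    return np.max(np.stack(feature_maps), axis=0)  # shape: (n_filters,)

# Hypothetical 7-word entity description, as pretrained word vectors.
desc = rng.normal(size=(7, word_dim))
W = rng.normal(scale=0.1, size=(n_filters, window * word_dim))
b = np.zeros(n_filters)
entity_desc_embedding = cnn_encode(desc, W, b)
```

In DKRL, description-based embeddings of this kind are trained jointly with the structural embeddings under the same translation-style objective.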

Concepts: CNNs, Enriching, Graphs, Triplet
Keywords: Based, Capture, Complex, Embedding, Embeddings, Entities, Entity, Graph, Models, Path, Relation, Relations, Semantic, Tensor, Triplet

Semantics

Type     Source    Name
disease  MESH      data sources
disease  MESH      uncertainty
drug     DRUGBANK  Tropicamide
drug     DRUGBANK  Flunarizine
drug     DRUGBANK  Guanosine
drug     DRUGBANK  Coenzyme M

Original Article
