Research Article

Representation Learning of Knowledge Graphs with Embedding Subspaces

Algorithm 1

Embedding knowledge graphs into a subspace.
Input: training triples T = {(h, r, t)}, entities E, initial entity matrix W_E, relations R, embedding subspace dimension s, dropout probability, candidate sampling rate, regularization parameter α
Output: subspace adaptive matrix W_s, relation embedding matrix W_R, combination operators D_eh, D_rh, D_et, D_rt
Algorithm sub-ProjE(T, E, W_E, R, s, dropout probability, candidate sampling rate, α)
(1) Initialize the adaptive matrix W_s, the relation matrix W_R, and the combination operators (diagonal matrices) D_eh, D_rh, D_et, D_rt from a uniform distribution.
(2) loop // one training iteration (epoch)
(3)   initialize the training-data sets to ∅ // training data
(4)   for each (h, r, t) ∈ T and each given entity e ∈ {h, t} do // construct training data using all training triples
(5)     …
(6)     if e == h then // the tail is missing
(7)       …
(8)       … // all positive tails from T and some sampled negative candidates
(9)     else // the head is missing
(10)      …
(11)      … // all positive heads from T and some sampled negative candidates
(12)    end if
(13)  end for
(14)  for each minibatch do // minibatches
(15)    …
(16)    for each training instance do // training instance
(17)      …
(18)      …
(19)      …
(20)    end for
(21)    …
(22)    update all parameters w.r.t. the loss
(23)  end for
(24) end loop
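Because the listing elides most of the per-step formulas, the NumPy sketch below gives only one plausible reading of the training loop: candidate construction as in steps (4)-(13), projection of the fixed initial entity embeddings into the s-dimensional subspace through the adaptive matrix W_s, a ProjE-style combination with diagonal operators, and a listwise softmax loss with α-weighted L2 regularization. The function names, the exact scoring function, the dropout placement, and the loss form are assumptions for illustration, not the paper's definitions.

import numpy as np

rng = np.random.default_rng(0)


def build_candidates(given_entity, rel, triples, num_entities, sample_rate, tail_missing=True):
    # Candidate list for one training instance (assumed reading of steps (4)-(13)):
    # every positive answer seen in the training triples plus a random sample of
    # negative entities drawn at the candidate sampling rate.
    if tail_missing:                        # e == h: collect tails for (h, r, ?)
        positives = {t for (h, r, t) in triples if h == given_entity and r == rel}
    else:                                   # e == t: collect heads for (?, r, t)
        positives = {h for (h, r, t) in triples if t == given_entity and r == rel}
    negatives = [c for c in range(num_entities)
                 if c not in positives and rng.random() < sample_rate]
    candidates = np.array(sorted(positives) + negatives)
    labels = np.zeros(len(candidates))
    labels[: len(positives)] = 1.0          # 1 for positives, 0 for sampled negatives
    return candidates, labels


def score_candidates(e_init, r_emb, W_s, d_e, d_r, cand_init, dropout_p):
    # Hypothetical ProjE-style scoring in the s-dimensional subspace: the fixed
    # initial embeddings are projected with the adaptive matrix W_s, the given
    # entity and the relation are combined with diagonal operators (stored as
    # vectors d_e, d_r), and each candidate is scored by a dot product.
    e = e_init @ W_s                        # project the given entity into the subspace
    c = cand_init @ W_s                     # project all candidate entities
    combined = np.tanh(d_e * e + d_r * r_emb)
    keep = (rng.random(combined.shape) > dropout_p) / (1.0 - dropout_p)  # inverted dropout
    return c @ (combined * keep)


def instance_loss(scores, labels, params, alpha):
    # Listwise softmax cross-entropy over the candidate list plus L2 regularization
    # weighted by alpha; one plausible form of the loss that step (22) minimizes.
    p = np.exp(scores - scores.max())
    p /= p.sum()
    target = labels / labels.sum()
    ce = -np.sum(target * np.log(p + 1e-12))
    reg = alpha * sum(np.sum(w ** 2) for w in params)
    return ce + reg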
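Note also that the initial entity matrix W_E appears only among the inputs, while the outputs list W_s, W_R, and the diagonal operators; this suggests that the pretrained entity embeddings stay fixed and only the much smaller subspace parameters are trained. Under that reading, step (22) would update W_s, W_R, D_eh, D_rh, D_et, and D_rt against the loss, typically via an automatic-differentiation framework rather than the hand-written forward pass sketched above.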