id: cord-104001-5clslvqb
author: Wang, Xiaoqi
title: selfRL: Two-Level Self-Supervised Transformer Representation Learning for Link Prediction of Heterogeneous Biomedical Networks
date: 2020-10-21
pages:
extension: .txt
mime: text/plain
words: 5522
sentences: 292
flesch: 49
summary: The meta path detection-based self-supervised learning task is proposed to learn representation vectors that capture global-level structure and semantic features in HBNs. The vertex entity mask-based self-supervised learning mechanism is designed to enhance local associations between vertices. First, a meta path detection self-supervised learning mechanism is developed to train a deep Transformer encoder that learns low-dimensional representations capturing path-level information in HBNs. Meanwhile, selfRL integrates the vertex entity mask task to learn local associations of vertices in HBNs. Finally, the representations from the entity mask and meta path detection tasks are concatenated to generate the embedding vectors of nodes in HBNs. The results of link prediction on six datasets show that the proposed selfRL is superior to 25 state-of-the-art methods. • We propose a two-level self-supervised representation learning method for HBNs that integrates the meta path detection and vertex entity mask self-supervised learning tasks, using a large amount of unlabeled data to learn high-quality representation vectors of vertices.
cache: ./cache/cord-104001-5clslvqb.txt
txt: ./txt/cord-104001-5clslvqb.txt
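The summary describes concatenating the representations learned by the two self-supervised tasks (meta path detection and vertex entity mask) to form node embeddings that are then scored for link prediction. The following is a minimal, hypothetical PyTorch sketch of that concatenation-and-scoring step only; TwoLevelEmbedding, score_links, the embedding tables standing in for the two encoders, and the dot-product scorer are illustrative assumptions, not the paper's implementation.

    # Minimal sketch (not the authors' code): concatenate a path-level
    # representation and an entity-mask representation per node, then score
    # candidate edges. All names and shapes here are hypothetical.
    import torch
    import torch.nn as nn

    class TwoLevelEmbedding(nn.Module):
        def __init__(self, num_nodes, dim):
            super().__init__()
            # Stand-ins for the two self-supervised encoders described in the
            # summary: one capturing path-level (global) structure, one
            # capturing local entity-mask associations.
            self.path_emb = nn.Embedding(num_nodes, dim)   # meta path detection level
            self.mask_emb = nn.Embedding(num_nodes, dim)   # vertex entity mask level

        def forward(self, node_ids):
            # Concatenate the two representations into the final node embedding.
            return torch.cat([self.path_emb(node_ids), self.mask_emb(node_ids)], dim=-1)

    def score_links(embed, src_ids, dst_ids):
        # Dot-product scorer for candidate edges (one common choice; the
        # paper's predictor may differ).
        return (embed(src_ids) * embed(dst_ids)).sum(dim=-1)

    # Usage with toy data.
    embed = TwoLevelEmbedding(num_nodes=100, dim=32)
    src = torch.tensor([0, 5, 7])
    dst = torch.tensor([3, 9, 2])
    print(score_links(embed, src, dst))  # unnormalized link scores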