
Multiple Run Ensemble Learning with Low-Dimensional Knowledge Graph Embeddings [article]

Chengjin Xu, Mojtaba Nayyeri, Sahar Vahdati, Jens Lehmann
2021 arXiv   pre-print
Among the top approaches of recent years, link prediction using knowledge graph embedding (KGE) models has gained significant attention for knowledge graph completion.  ...  Experimental results on standard benchmark datasets, namely FB15K, FB15K-237 and WN18RR, show that multiple low-dimensional models of the same kind outperform the corresponding single high-dimensional  ...  Index Terms-Graph Embedding, Ensemble Learning, Statistical Relational Learning, Link Prediction I.  ... 
arXiv:2104.05003v2 fatcat:g7wpclhx2zanxfkbszjg2ryb7y
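The core idea of this entry — that several low-dimensional models of the same kind can be combined for link prediction — can be sketched with a toy TransE-style scorer. The embeddings below are random stand-ins, not the authors' trained models, and `ensemble_score` is a hypothetical name for the simplest combination rule (score averaging):

```python
import numpy as np

def transe_score(h, r, t):
    # TransE plausibility: negative L2 distance between (head + relation) and tail
    return -np.linalg.norm(h + r - t)

def ensemble_score(models, triple):
    # Average the triple scores of several independently initialized
    # low-dimensional models (the multiple-run ensemble idea)
    h, r, t = triple
    return np.mean([transe_score(m["ent"][h], m["rel"][r], m["ent"][t])
                    for m in models])

rng = np.random.default_rng(0)
dim, n_ent, n_rel, n_models = 8, 5, 2, 4   # low dimension, several runs
models = [{"ent": rng.normal(size=(n_ent, dim)),
           "rel": rng.normal(size=(n_rel, dim))} for _ in range(n_models)]
score = ensemble_score(models, (0, 1, 2))
```

Each model is cheap to train at low dimension, and averaging their scores is one of several plausible aggregation strategies.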

Graph Representation Ensemble Learning [article]

Palash Goyal, Di Huang, Sujit Rokka Chhetri, Arquimedes Canedo, Jaya Shree, Evan Patterson
2019 arXiv   pre-print
Most embedding methods aim to preserve certain properties of the original graph in the low dimensional space.  ...  In this work, we introduce the problem of graph representation ensemble learning and provide a first of its kind framework to aggregate multiple graph embedding methods efficiently.  ...  they aim to capture and define an objective function to learn these features in the low-dimensional embedding.  ... 
arXiv:1909.02811v2 fatcat:fmnffifkuzbrzb55ogkhod2nwm

Graph embedding ensemble methods based on the heterogeneous network for lncRNA-miRNA interaction prediction

Chengshuai Zhao, Yang Qiu, Shuang Zhou, Shichao Liu, Wen Zhang, Yanqing Niu
2020 BMC Genomics  
Results In this paper, we propose novel lncRNA-miRNA prediction methods by using graph embedding and ensemble learning.  ...  Second, we adopt several graph embedding methods to learn embedded representations of lncRNAs and miRNAs from the heterogeneous network, and construct the ensemble models using two ensemble strategies.  ...  Graph embedding learning (a.k.a. network representation learning), can be employed to preserve the structural property of the graph and map nodes of the graph into low-dimensional space, attracting widespread  ... 
doi:10.1186/s12864-020-07238-x pmid:33334307 fatcat:sghtkwpykbgjvdnbtmh2k5dzpe

Multi-Level Network Embedding with Boosted Low-Rank Matrix Approximation [article]

Jundong Li, Liang Wu, Huan Liu
2018 arXiv   pre-print
The learned low-dimensional node vector representation is generalizable and eases the knowledge discovery process on graphs by enabling various off-the-shelf machine learning tools to be directly applied  ...  The proposed BoostNE method is also in line with the successful gradient boosting method in ensemble learning as multiple weak embeddings lead to a stronger and more effective one.  ...  threshold is set as 0.05. • Compared with NetMF which learns node embedding within a single run of low-rank matrix approximation, the proposed BoostNE learns multiple embedding representations from coarse  ... 
arXiv:1808.08627v1 fatcat:2vtmmbcwwnbfzmqpxb6h66ibk4
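The boosting view described in this snippet — multiple weak embeddings learned from successive residuals of a matrix, then concatenated into a stronger one — can be sketched with plain SVD. This is a simplification in the spirit of BoostNE, not the paper's exact factorization, and `boostne_like` is a hypothetical name:

```python
import numpy as np

def boostne_like(M, k, stages):
    """Learn `stages` weak rank-k embeddings from successive residuals of M
    and concatenate them, in the gradient-boosting spirit of BoostNE."""
    residual = M.astype(float).copy()
    parts = []
    for _ in range(stages):
        U, s, Vt = np.linalg.svd(residual, full_matrices=False)
        approx = (U[:, :k] * s[:k]) @ Vt[:k]       # best rank-k fit to residual
        parts.append(U[:, :k] * np.sqrt(s[:k]))    # one weak node embedding
        residual -= approx                         # boost on what is left over
    return np.hstack(parts)                        # stronger combined embedding

M = np.random.default_rng(1).normal(size=(6, 6))
emb = boostne_like(M, k=2, stages=3)               # shape (6, 2 * 3)
```

Unlike a single high-rank factorization, each stage only has to explain what the previous stages missed.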

Knowledge-Enriched Two-Layered Attention Network for Sentiment Analysis

Abhishek Kumar, Daisuke Kawahara, Sadao Kurohashi
2018 Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)  
The novel two-layered attention network takes advantage of the external knowledge bases to improve the sentiment prediction. It uses the Knowledge Graph Embedding generated using the WordNet.  ...  We build our model by combining the two-layered attention network with the supervised model based on Support Vector Regression using a Multilayer Perceptron network for sentiment analysis.  ...  We then employ Knowledge Graph Embeddings to learn the representation of the triplet.  ... 
doi:10.18653/v1/n18-2041 dblp:conf/naacl/KumarKK18 fatcat:rctoab4wrrgpbdqzm6liok4tjy

Semi-Supervised Clustering With Multiresolution Autoencoders

Dino Ienco, Ruggero G. Pensa
2018 2018 International Joint Conference on Neural Networks (IJCNN)  
Our algorithm employs a multiresolution strategy to generate an ensemble of semi-supervised autoencoders that fit the data together with the background knowledge.  ...  Usually, in the semi-supervised clustering setting, the background knowledge is converted to some kind of constraint and, successively, metric learning or constrained clustering are adopted to obtain the  ...  low-dimensional embeddings.  ... 
doi:10.1109/ijcnn.2018.8489353 dblp:conf/ijcnn/IencoP18 fatcat:yjqx44c4bngezmyg32zxpid22u

Spectral Clustering via Ensemble Deep Autoencoder Learning (SC-EDAE) [article]

Severine Affeldt, Lazhar Labiod, Mohamed Nadif
2019 arXiv   pre-print
These approaches follow either a sequential way, where a deep representation is learned using a deep autoencoder before obtaining clusters with k-means, or a simultaneous way, where deep representation  ...  To alleviate the impact of such hyperparameters setting on the clustering performance, we propose a new model which combines the spectral clustering and deep autoencoder strengths in an ensemble learning  ...  In a second step, we construct a graph matrix S associated to each embedding Y , and then fuse the m graph matrices in an ensemble graph matrix S which contains information provided by the m embeddings  ... 
arXiv:1901.02291v2 fatcat:vzzjrn6345bsriymwnktejm4py
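The second step this snippet describes — building a graph matrix S from each embedding Y and fusing the m matrices into one ensemble matrix — can be sketched with an RBF affinity and a simple average. The embeddings below are random placeholders for autoencoder outputs, and the averaging fusion is one plausible reading of the step, not SC-EDAE's exact construction:

```python
import numpy as np

def rbf_affinity(Y, gamma=1.0):
    # Pairwise RBF similarity graph for one embedding Y of shape (n, dim)
    sq_dists = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)

def ensemble_affinity(embeddings):
    # Fuse the m per-embedding graph matrices into one ensemble matrix S
    return np.mean([rbf_affinity(Y) for Y in embeddings], axis=0)

rng = np.random.default_rng(2)
embeddings = [rng.normal(size=(10, d)) for d in (2, 3, 4)]  # m = 3 deep embeddings
S = ensemble_affinity(embeddings)  # symmetric (10, 10) ensemble graph
```

The ensemble matrix S would then feed a standard spectral clustering step, which is what makes the result less sensitive to any one autoencoder's hyperparameters.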

Knowledge-enriched Two-layered Attention Network for Sentiment Analysis [article]

Abhishek Kumar, Daisuke Kawahara, Sadao Kurohashi
2018 arXiv   pre-print
The novel two-layered attention network takes advantage of the external knowledge bases to improve the sentiment prediction. It uses the Knowledge Graph Embedding generated using the WordNet.  ...  We build our model by combining the two-layered attention network with the supervised model based on Support Vector Regression using a Multilayer Perceptron network for sentiment analysis.  ...  We then employ Knowledge Graph Embeddings to learn the representation of the triplet.  ... 
arXiv:1805.07819v4 fatcat:xf7dmrahybcuxlzzk5wcer7kuy

MultiWalk: A Framework to Generate Node Embeddings Based on an Ensemble of Walk Methods [article]

Kaléu Delphino
2021 arXiv   pre-print
Graph embeddings are low-dimensional representations of nodes, edges or whole graphs.  ...  Such representations allow for data in a network format to be used along with machine learning models for a variety of tasks (e.g., node classification), where using a similarity matrix would be impractical  ...  Embeddings are projections of graph data into a d-dimensional space.  ... 
arXiv:2102.11691v1 fatcat:jmp35wwf6zgsbcwfimisd4gfti

Inductive Relation Prediction by Subgraph Reasoning [article]

Komal K. Teru, Etienne Denis, William L. Hamilton
2020 arXiv   pre-print
We also demonstrate significant gains obtained by ensembling GraIL with various knowledge graph embedding methods in the transductive setting, highlighting the complementary inductive bias of our method  ...  The dominant paradigm for relation prediction in knowledge graphs involves learning and operating on latent representations (i.e., embeddings) of entities and relations.  ...  In particular, we demonstrated, with a thorough set of experiments, performance boosts to various knowledge graph embedding methods when ensembled with GraIL.  ... 
arXiv:1911.06962v2 fatcat:xfm4ctvjenhslcdclpoqmi7axi

Constructing Knowledge Graphs and Their Biomedical Applications

David Nicholson, Casey S. Greene
2020 Computational and Structural Biotechnology Journal  
A number of techniques are used to represent knowledge graphs, but often machine learning methods are used to construct a low-dimensional representation that can support many different applications.  ...  Advances in machine learning for biomedicine are creating new opportunities across many domains, and we note potential avenues for future work with knowledge graphs that appear particularly promising.  ...  In many cases, solutions rely on representing knowledge graphs in a low dimensional space, which is a process called representational learning.  ... 
doi:10.1016/j.csbj.2020.05.017 pmid:32637040 pmcid:PMC7327409 fatcat:eontflxz3fggdnw3jzajzr2bdu

Fine-Grained Evaluation of Rule- and Embedding-Based Systems for Knowledge Graph Completion [chapter]

Christian Meilicke, Manuel Fink, Yanjie Wang, Daniel Ruffinelli, Rainer Gemulla, Heiner Stuckenschmidt
2018 Lecture Notes in Computer Science  
Over the recent years, embedding methods have attracted increasing focus as a means for knowledge graph completion. Similarly, rule-based systems have been studied for this task in the past.  ...  Motivated by these insights, we combine both families of approaches via ensemble learning. The results support our assumption that the two methods complement each other in a beneficial way.  ...  Recently, a new family of models for knowledge graph completion has received increasing attention. These models are based on embedding the knowledge graph into a low dimensional space.  ... 
doi:10.1007/978-3-030-00671-6_1 fatcat:ejsbyhthtjbcbj6uxzk2nl3ppy
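Combining a rule-based and an embedding-based completion system, as this entry proposes, requires merging their candidate rankings. One generic way to do that (reciprocal rank fusion, shown here as an illustration rather than the paper's actual combination scheme) rewards candidates that rank highly in either system:

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Fuse candidate rankings from heterogeneous systems, e.g. a rule-based
    and an embedding-based knowledge graph completion model."""
    scores = {}
    for ranking in rankings:
        for pos, cand in enumerate(ranking):
            # Each system contributes 1 / (k + rank); k damps the head of the list
            scores[cand] = scores.get(cand, 0.0) + 1.0 / (k + pos + 1)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical tail predictions for one query from two complementary systems
rule_ranking = ["Paris", "Lyon", "Nice"]
kge_ranking = ["Paris", "Lyon", "Marseille"]
fused = reciprocal_rank_fusion([rule_ranking, kge_ranking])
```

Because the two families tend to get different triples right, candidates endorsed by both systems float to the top of the fused list.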

Probability Calibration for Knowledge Graph Embedding Models [article]

Pedro Tabacof, Luca Costabello
2020 arXiv   pre-print
Knowledge graph embedding research has overlooked the problem of probability calibration. We show popular embedding models are indeed uncalibrated.  ...  We present a novel method to calibrate a model when ground truth negatives are not available, which is the usual case in knowledge graphs.  ...  The authors propose to use ensembles in order to improve the results of knowledge graph embedding tasks.  ... 
arXiv:1912.10000v2 fatcat:djdcvrseibaexiyy7eqe6o3fcy

MARINE: Multi-relational Network Embeddings with Relational Proximity and Node Attributes

Noriaki Kawamae
2019 The World Wide Web Conference on - WWW '19  
We also extend the framework to incorporate existing features of nodes in a graph, which can further be exploited for the ensemble of embedding.  ...  Network embedding aims at learning an effective vector transformation for entities in a network.  ...  edge relation and the node attributes information in a low-dimensional embedding space.  ... 
doi:10.1145/3308558.3313715 dblp:conf/www/Kawamae19 fatcat:mt5o5joidvgapgmf4lain3t3be

Semisupervised Deep Embedded Clustering with Adaptive Labels

Zhikui Chen, Chaojie Li, Jing Gao, Jianing Zhang, Peng Li, Boxiang Dong
2021 Scientific Programming  
To tackle this challenge, a semisupervised deep embedded clustering algorithm with adaptive labels is proposed to cluster those data in a semisupervised end-to-end manner on the basis of a little priori knowledge.  ...  It can well alleviate the degradation of traditional clustering in the face of high-dimensional input data by learning low-dimensional representations of data. For example, Lv et al.  ...
doi:10.1155/2021/6613452 fatcat:wg7z2f3wgvgp3ksiduls5wrsxu