15,905 Hits in 8.5 sec

Improving Inductive Link Prediction Using Hyper-Relational Facts [article]

Mehdi Ali, Max Berrendorf, Mikhail Galkin, Veronika Thost, Tengfei Ma, Volker Tresp, Jens Lehmann
2021 arXiv   pre-print
In this work, we classify different inductive settings and study the benefits of employing hyper-relational KGs on a wide range of semi- and fully inductive link prediction tasks powered by recent advancements  ...  For many years, link prediction on knowledge graphs (KGs) has been a purely transductive task, not allowing for reasoning on unseen entities.  ...  -Our experiments suggest that models supporting hyper-relational facts indeed improve link prediction in both inductive settings compared to strong triple-only baselines by more than 6% Hits@10.  ... 
arXiv:2107.04894v1 fatcat:knfcrakjwzh7lavf2q5oewj7wa

Node Classification Meets Link Prediction on Knowledge Graphs [article]

Ralph Abboud, İsmail İlkan Ceylan
2021 arXiv   pre-print
By contrast, link prediction models are solely motivated by relational incompleteness of the input graphs, and do not typically leverage node features or classes.  ...  Node classification and link prediction are widely studied in graph representation learning.  ...  Relational inductive bias. MLP-X maintains the relational inductive bias of shallow embedding models.  ... 
arXiv:2106.07297v2 fatcat:hqayfccwa5dphalsvac527xnti

Revisiting Unsupervised Relation Extraction [article]

Thy Thy Tran, Phong Le, Sophia Ananiadou
2020 arXiv   pre-print
However, we demonstrate that by using only named entities to induce relation types, we can outperform existing methods on two popular datasets.  ...  We conclude that entity types provide a strong inductive bias for URE.  ...  Acknowledgments We would like to thank the reviewers for their comments, Diego Marcheggiani for sharing his dataset with us, and Étienne Simon for sharing the hyperparameters.  ... 
arXiv:2005.00087v1 fatcat:tsp6una42rdflmw66u5g6dhlta

KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation [article]

Xiaozhi Wang, Tianyu Gao, Zhaocheng Zhu, Zhengyan Zhang, Zhiyuan Liu, Juanzi Li, Jian Tang
2020 arXiv   pre-print
Experimental results show that KEPLER achieves state-of-the-art performances on various NLP tasks, and also works remarkably well as an inductive KE model on KG link prediction.  ...  In contrast, knowledge embedding (KE) methods can effectively represent the relational facts in knowledge graphs (KGs) with informative entity embeddings, but conventional KE models cannot take full advantage  ...  KEPLER-Cond uses the entity-embedding-conditioned-on-relation method (Equation 5). This model achieves superior results in link prediction tasks, both transductive and inductive (Section 4.3).  ... 
arXiv:1911.06136v3 fatcat:efa4lpu7iff4fmuj7u7ooiabwq

Semi-Supervised Learning with Declaratively Specified Entropy Constraints [article]

Haitian Sun, William W. Cohen, Lidong Bing
2018 arXiv   pre-print
We show consistent improvements on a suite of well-studied SSL benchmarks, including a new state-of-the-art result on a difficult relation extraction task.  ...  The proposed method can be used to specify ensembles of semi-supervised learning, as well as agreement constraints and entropic regularization constraints between these learners, and can be used to model  ...  Many algorithms are proposed to exploit such link structure to improve the prediction accuracy.  ... 
arXiv:1804.09238v2 fatcat:prq6iil3bbhbfgawys2uvyxeqq

Subgraph-aware Few-Shot Inductive Link Prediction via Meta-Learning [article]

Shuangjia Zheng, Sijie Mai, Ya Sun, Haifeng Hu, Yuedong Yang
2021 arXiv   pre-print
In this way, we find the model can quickly adapt to few-shot relationships using only a handful of known facts with inductive settings.  ...  Link prediction for knowledge graphs aims to predict missing connections between entities. Prevailing methods are limited to a transductive setting and hard to process unseen entities.  ...  RELATED WORK Inductive relation prediction.  ... 
arXiv:2108.00954v1 fatcat:2hqxng3djzcdxdcd4hrdu5ybda

Inductive Learning on Commonsense Knowledge Graph Completion [article]

Bin Wang, Guangtao Wang, Jing Huang, Jiaxuan You, Jure Leskovec, C.-C. Jay Kuo
2021 arXiv   pre-print
InductivE performs especially well on inductive scenarios where it achieves above 48% improvement over present methods.  ...  Different from previous approaches, InductivE ensures the inductive learning capability by directly computing entity embeddings from raw entity attributes/text.  ...  Therefore, predicting missing facts is one of the most fundamental problems in this field.  ... 
arXiv:2009.09263v2 fatcat:fjwqsjcl7rdvnn262ptcr5poma

Link mining

Lise Getoor
2003 SIGKDD Explorations  
A key challenge for data mining is tackling the problem of mining richly structured datasets, where the objects are linked in some way.  ...  Links among the objects may demonstrate certain patterns, which can be helpful for many data mining tasks and are usually hard to capture with traditional statistical models.  ...  Clearly, this is information that can be used to improve the predictive accuracy of the learned models: attributes of linked objects are often correlated and links are more likely to exist between objects  ... 
doi:10.1145/959242.959253 fatcat:th3ijcstpvdyzbaxq7payz2vle

Logic Attention Based Neighborhood Aggregation for Inductive Knowledge Graph Embedding [article]

Peifeng Wang, Jialong Han, Chenliang Li, Rong Pan
2020 arXiv   pre-print
Recent efforts on this issue suggest training a neighborhood aggregator in conjunction with the conventional entity and relation embeddings, which may help embed new entities inductively via their existing  ...  Knowledge graph embedding aims at modeling entities and relations with low-dimensional vectors.  ...  Experiments on Link Prediction Link prediction in the inductive setting aims at reasoning the missing part "?" in a triplet when given (s, r, ?) or (?  ... 
arXiv:1811.01399v2 fatcat:pql27ichrjduvm2uosoo64kct4

Logic Attention Based Neighborhood Aggregation for Inductive Knowledge Graph Embedding

Peifeng Wang, Jialong Han, Chenliang Li, Rong Pan
2019 Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence and the Thirty-First Innovative Applications of Artificial Intelligence Conference
Recent efforts on this issue suggest training a neighborhood aggregator in conjunction with the conventional entity and relation embeddings, which may help embed new entities inductively via their existing  ...  Knowledge graph embedding aims at modeling entities and relations with low-dimensional vectors.  ...  Experiments on Link Prediction Link prediction in the inductive setting aims at reasoning the missing part "?" in a triplet when given (s, r, ?) or (?  ... 
doi:10.1609/aaai.v33i01.33017152 fatcat:dxnkl5xz4bg5jerezwolta2thi

Editorial

2016 Intelligent Data Analysis  
the mainstream link prediction baseline methods.  ...  This reformulation allows one to use local search procedures designed for UBQP in order to improve the solutions of BMF.  ... 
doi:10.3233/ida-160828 fatcat:63hki4flcnc7fm53ugfctvtyk4

Hypergraph Pre-training with Graph Neural Networks [article]

Boxin Du, Changhe Yuan, Robert Barton, Tal Neiman, Hanghang Tong
2021 arXiv   pre-print
Third, the proposed framework can work in both transductive and inductive settings.  ...  The extensive experimental results demonstrate that: (1) HyperGene achieves up to 5.69% improvements in hyperedge classification, and (2) improves pre-training efficiency by up to 42.80% on average.  ...  For baselines, Hyper-SAGNN, SAGE, DHNE and Joint Training use inductive setting, and DW and Deep-Hyperedge use transductive setting. From the table, we can make the following observations.  ... 
arXiv:2105.10862v1 fatcat:m6lairp6mbgoph4g7fcgamq4lm

Layer-stacked Attention for Heterogeneous Network Embedding [article]

Nhat Tran, Jean Gao
2020 arXiv   pre-print
Recent graph neural network approaches for representation learning on heterogeneous networks typically employ the attention mechanism, which is often only optimized for predictions based on direct links  ...  In both transductive and inductive node classification tasks, LATTE can achieve state-of-the-art performance compared to existing approaches, all while offering a lightweight model.  ...  The benefits of the mechanism proposed are not only to improve inductive node classification performance but also to improve interpretation of deep GNN models.  ... 
arXiv:2009.08072v1 fatcat:rquxsvnmire6fkwudogq5cszpy

Prix-LM: Pretraining for Multilingual Knowledge Base Construction [article]

Wenxuan Zhou, Fangyu Liu, Ivan Vulić, Nigel Collier, Muhao Chen
2021 arXiv   pre-print
Experiments on standard entity-related tasks, such as link prediction in multiple languages, cross-lingual entity linking and bilingual lexicon induction, demonstrate its effectiveness, with gains reported  ...  Prix-LM integrates useful multilingual and KB-based factual knowledge into a single model.  ...  and 4) Bilingual lexicon induction (BLI) can be very useful for multilingual KB construction as they help to find cross-lingual entity links.  ... 
arXiv:2110.08443v1 fatcat:7kdeih45nbhpvmq2bouzlre5ja

Adaptive Multi-grained Graph Neural Networks [article]

Zhiqiang Zhong, Cheng-Te Li, Jun Pang
2020 arXiv   pre-print
More specifically, a differentiable pooling operator in AdamGNN is used to obtain a multi-grained structure that involves node-wise and meso/macro level semantic information.  ...  Note that, for the link prediction task, an equal number of nonexistent links is used as a supplementary part for every set.  ...  The improvement of our AdamGNN is more appealing in the link prediction task, e.g., achieving 54.88% improvement compared with the flat GNN models on the Wiki dataset.  ... 
arXiv:2010.00238v2 fatcat:ebfwp3524va6bbrucibspbq7sy
Showing results 1 — 15 out of 15,905 results