Graph Attention Networks
[article] · 2018 · arXiv pre-print
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods ...
In this way, we address several key challenges of spectral-based graph neural networks simultaneously, and make our model readily applicable to inductive as well as transductive problems. ...
CONCLUSIONS We have presented graph attention networks (GATs), novel convolution-style neural networks that operate on graph-structured data, leveraging masked self-attentional layers. ...
arXiv:1710.10903v3
fatcat:6xbvqpuxsjfuhfo7vqcdjmefgq
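For reference, the masked self-attention this abstract describes can be sketched in a few lines of NumPy. This is an illustrative single-head layer, not the authors' implementation; it follows the paper's scoring rule e_ij = LeakyReLU(a^T [Wh_i || Wh_j]) with the paper's LeakyReLU slope of 0.2, and assumes the adjacency matrix already contains self-loops:

```python
import numpy as np

def gat_layer(H, A, W, a, leaky_slope=0.2):
    """One masked self-attention (GAT) layer.
    H: (N, F) node features; A: (N, N) adjacency (1 = edge, incl. self-loops);
    W: (F, F') shared weight matrix; a: (2*F',) attention vector."""
    Z = H @ W                      # shared linear transform of every node
    N = Z.shape[0]
    # raw attention logits e_ij = LeakyReLU(a^T [z_i || z_j])
    e = np.empty((N, N))
    for i in range(N):
        for j in range(N):
            s = a @ np.concatenate([Z[i], Z[j]])
            e[i, j] = s if s > 0 else leaky_slope * s
    # masking: attend only over the neighborhood, then softmax row-wise
    e = np.where(A > 0, e, -np.inf)
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)
    return alpha @ Z               # attention-weighted neighbor aggregation
```

The masking step is what makes the layer graph-structured: softmax is taken only over each node's neighbors, so an isolated node (self-loop only) simply keeps its own transformed feature.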
How Attentive are Graph Attention Networks?
[article] · 2022 · arXiv pre-print
Graph Attention Networks (GATs) are one of the most popular GNN architectures and are considered as the state-of-the-art architecture for representation learning with graphs. ...
Because GATs use a static attention mechanism, there are simple graph problems that GAT cannot express: in a controlled problem, we show that static attention hinders GAT from even fitting the training ...
GNN variants -Graph Attention Network (GAT). ...
arXiv:2105.14491v3
fatcat:pp4xs7d2bffo7jb2p2zn626lya
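The "static attention" limitation this abstract points to is easiest to see in the scoring functions themselves. A minimal sketch (NumPy, not the authors' code): in GAT the LeakyReLU sits outside the dot product with a, so the score decomposes into a query part plus a key part and every query node ranks its neighbors identically; GATv2 moves the nonlinearity between W and a, making the score a joint function of the pair:

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_score(hi, hj, W, a):
    # GAT (v1): LeakyReLU(a^T [W hi || W hj]) = LeakyReLU(a1·Whi + a2·Whj),
    # so the ranking over j induced by a2·Whj is the same for every query i
    # ("static" attention).
    z = np.concatenate([W @ hi, W @ hj])
    return leaky_relu(a @ z)

def gatv2_score(hi, hj, W, a):
    # GATv2: a^T LeakyReLU(W [hi || hj]) -- the nonlinearity separates W
    # from a, so the score is a genuinely joint ("dynamic") function of
    # (hi, hj). Note W here acts on the concatenated pair, shape (F', 2F).
    z = leaky_relu(W @ np.concatenate([hi, hj]))
    return a @ z
```

Because LeakyReLU is strictly increasing, the GAT-v1 score's argmax over keys is independent of the query: with static attention, every node prefers the same neighbor.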
Graph Ordering Attention Networks
[article] · 2022 · arXiv pre-print
Graph Neural Networks (GNNs) have been successfully used in many problems involving graph-structured data, achieving state-of-the-art performance. ...
This is achieved by learning local node orderings via an attention mechanism and processing the ordered representations using a recurrent neural network aggregator. ...
Graph Neural Networks. ...
arXiv:2204.05351v2
fatcat:ptxpj6ghh5aw5kcl6vifsf5ghu
Ensemble Graph Attention Networks
2022 · Transactions on Machine Learning and Artificial Intelligence
We then propose two ensemble graph neural network models – Ensemble-GAT and Ensemble-HetGAT by applying the ensemble strategy to the graph attention network (GAT), and a heterogeneous graph attention network ...
Graph neural networks have demonstrated their success in many applications on graph-structured data. ...
Ensemble Graph Attention Networks. ...
doi:10.14738/tmlai.103.12399
fatcat:qivrmha2gjhwrft4ejhfomk5ti
Hyperbolic Graph Attention Network
[article] · 2019 · arXiv pre-print
Graph neural network (GNN) has shown superior performance in dealing with graphs, which has attracted considerable research attention recently. ...
The comprehensive experimental results on four real-world datasets demonstrate the performance of our proposed hyperbolic graph attention network model in comparison with other state-of-the-art baseline ...
Hyperbolic representation learning has attracted considerable research attention recently. ...
arXiv:1912.03046v1
fatcat:ag2elsxc4bdyfakvf72juawz2i
Heterogeneous Graph Attention Network
[article] · 2021 · arXiv pre-print
In this paper, we first propose a novel heterogeneous graph neural network based on the hierarchical attention, including node-level and semantic-level attentions. ...
The heterogeneity and rich semantic information pose great challenges for designing a graph neural network for heterogeneous graphs. ...
Graph Attention Network (GAT) [35], a novel convolution-style graph neural network, leverages the attention mechanism for homogeneous graphs, which include only one type of node or link. ...
arXiv:1903.07293v2
fatcat:wrwdajlnm5bahhh26ngd5z7pfe
Sparse Graph Attention Networks
[article] · 2021 · arXiv pre-print
Among the variants of GNNs, Graph Attention Networks (GATs) learn to assign dense attention coefficients over all neighbors of a node for feature aggregation, and improve the performance of many graph ...
In this paper, we propose Sparse Graph Attention Networks (SGATs) that learn sparse attention coefficients under an L_0-norm regularization, and the learned sparse attentions are then used for all GNN ...
CONCLUSION In this paper, we propose sparse graph attention networks (SGATs) that integrate a sparse attention mechanism into graph attention networks (GATs) via an L_0-norm regularization on the number ...
arXiv:1912.00552v2
fatcat:vqvo7aty6beyphlgtgi5qsg7fq
Signed Graph Attention Networks
[article] · 2019 · arXiv pre-print
In this paper, we propose Signed Graph Attention Networks (SiGATs), generalizing GAT to signed networks. ...
As a representative implementation of GNNs, Graph Attention Networks (GATs) are successfully applied in a variety of tasks on real datasets. ...
Signed Graph Attention Networks GAT [23] introduces the attention mechanism into graphs. It uses a weight matrix to characterize the different effects that different nodes have on the target node. ...
arXiv:1906.10958v3
fatcat:jx6eeecytfgm3d4d22rap2gh4a
Relational Graph Attention Networks
[article] · 2019 · arXiv pre-print
We investigate Relational Graph Attention Networks, a class of models that extends non-relational graph attention mechanisms to incorporate relational information, opening up these methods to a wider variety ...
To provide a meaningful comparison, we retrain Relational Graph Convolutional Networks, the spectral counterpart of Relational Graph Attention Networks, and evaluate them under the same conditions. ...
Conclusion We have investigated a class of models we call Relational Graph Attention Networks (RGATs). ...
arXiv:1904.05811v1
fatcat:blea66xmsvflfffhbpmffpzwyu
Universal Graph Transformer Self-Attention Networks
[article] · 2022 · arXiv pre-print
We introduce a transformer-based GNN model, named UGformer, to learn graph representations. ...
Experimental results demonstrate that the first UGformer variant achieves state-of-the-art accuracies on benchmark datasets for graph classification in both inductive setting and unsupervised transductive ...
It is worth noting that all links/interactions among all positions in the self-attention layer build up a complete network. ...
arXiv:1909.11855v13
fatcat:eqc3zguszvdhden5ycwdz75rze
Attention Guided Graph Convolutional Networks for Relation Extraction
[article] · 2020 · arXiv pre-print
In this work, we propose Attention Guided Graph Convolutional Networks (AGGCNs), a novel model which directly takes full dependency trees as inputs. ...
In this paper, we propose the novel Attention Guided Graph Convolutional Networks (AGGCNs), which operate directly on the full tree. ...
Graph Convolutional Networks. Early efforts to extend neural networks to arbitrarily structured graphs were introduced by Gori et al. (2005); Bruna (2014). ...
arXiv:1906.07510v8
fatcat:duxwaeueorb5dkhanhg2vlv32m
SGAT: Simplicial Graph Attention Network
[article] · 2022 · arXiv pre-print
In this paper, we present Simplicial Graph Attention Network (SGAT), a simplicial complex approach to represent such high-order interactions by placing features from non-target nodes on the simplices. ...
To learn such complex semantics, many graph neural network approaches for heterogeneous graphs use metapaths to capture multi-hop interactions between nodes. ...
Recently, graph neural networks (GNNs), which learn tasks utilizing graph-structured data, have attracted much attention. ...
arXiv:2207.11761v1
fatcat:lonsnt54jjeg7gya6wyliwjuvu
Personalized PageRank Graph Attention Networks
[article] · 2022 · arXiv pre-print
In this work, we incorporate the limit distribution of Personalized PageRank (PPR) into graph attention networks (GATs) to reflect the larger neighbor information without introducing over-smoothing. ...
There has been a rising interest in graph neural networks (GNNs) for representation learning over the past few years. ...
Graph Attention Networks Recently, attention networks have achieved state-of-the-art results in many tasks [14]. ...
arXiv:2205.14259v1
fatcat:xyccmktkobf4rdvtipxpsbog6m
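As background for this abstract, the limit distribution of Personalized PageRank has a well-known closed form, Pi = alpha (I - (1 - alpha) P)^-1 with P a normalized adjacency; row i of Pi is node i's PPR vector over the whole graph. A dense NumPy sketch (the symmetric normalization with self-loops is an assumption here, not taken from this paper):

```python
import numpy as np

def ppr_limit(A, alpha=0.15):
    """Dense personalized-PageRank limit distribution
    Pi = alpha * (I - (1 - alpha) * D^-1/2 (A + I) D^-1/2)^-1.
    A: (N, N) adjacency; alpha: teleport (restart) probability."""
    N = A.shape[0]
    A_hat = A + np.eye(N)                      # add self-loops
    d = A_hat.sum(axis=1)                      # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    P = D_inv_sqrt @ A_hat @ D_inv_sqrt        # symmetrically normalized adjacency
    return alpha * np.linalg.inv(np.eye(N) - (1 - alpha) * P)
```

At alpha = 1 the walker always restarts and Pi reduces to the identity; as alpha shrinks, Pi spreads mass over ever-larger neighborhoods, which is the "larger neighbor information" the abstract refers to.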
Dual-Attention Graph Convolutional Network
[article] · 2019 · arXiv pre-print
Graph convolutional networks (GCNs) have shown the powerful ability in text structure representation and effectively facilitate the task of text classification. ...
In this paper, we propose a dual-attention GCN to model the structural information of various texts as well as tackle the graph-invariant problem through embedding two types of attention mechanisms, i.e ...
Graph Convolutional Network. In recent years, graph convolutional networks (GCNs) have gained increasing attention because of their unique advantages. ...
arXiv:1911.12486v1
fatcat:a5x4f3ypknhdrb7esmdgndpwzy
Context-Aware Graph Attention Networks
[article] · 2019 · arXiv pre-print
In this paper, we propose a novel unified GNN model, named Context-aware Adaptive Graph Attention Network (CaGAT). ...
Graph Neural Networks (GNNs) have been widely studied for graph data representation and learning. ...
Conclusion In this paper, we propose a novel spatial graph neural network, named Context-aware Adaptive Graph Attention Network (CaGAT). ...
arXiv:1910.01736v1
fatcat:4g6t2tprananfh7k4dfjrxh7ze
Showing results 1 — 15 out of 373,946 results