Heterogeneous Graph Transformer for Graph-to-Sequence Learning
2020
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
unpublished
Graph-to-sequence (Graph2Seq) learning aims to transduce graph-structured representations into word sequences for text generation. Recent studies have proposed various models to encode graph structure. ...
In this paper, we propose the Heterogeneous Graph Transformer to independently model the different relations in the individual subgraphs of the original graph, including direct relations, indirect relations ...
We thank the anonymous reviewers for their helpful comments. Xiaojun Wan is the corresponding author. ...
doi:10.18653/v1/2020.acl-main.640
fatcat:izffulszyvaelpw3aridsb7m54
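A minimal Python/NumPy sketch of the relation-specific attention the abstract describes, assuming each relation subgraph has its own query/key/value projections and that per-relation messages are merged by summation; all names and dimensions are illustrative, not the authors' code:

    # Sketch only: per-relation attention over subgraphs, merged by summation.
    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def relation_attention(h, subgraphs, Wq, Wk, Wv):
        """h: (n, d) node states; subgraphs: {relation: {node: [neighbors]}}."""
        n, d = h.shape
        out = np.zeros_like(h)
        for rel, adj in subgraphs.items():
            q, k, v = h @ Wq[rel], h @ Wk[rel], h @ Wv[rel]
            for i, nbrs in adj.items():
                if not nbrs:
                    continue
                scores = np.array([q[i] @ k[j] for j in nbrs]) / np.sqrt(d)
                out[i] += softmax(scores) @ v[nbrs]  # assumed merge: summation
        return out

    rng = np.random.default_rng(0)
    d = 8
    h = rng.normal(size=(4, d))
    subgraphs = {"direct": {0: [1, 2], 1: [0]}, "indirect": {0: [3]}}
    W = lambda: {r: rng.normal(size=(d, d)) / np.sqrt(d) for r in subgraphs}
    print(relation_attention(h, subgraphs, W(), W(), W()).shape)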
HetEmotionNet: Two-Stream Heterogeneous Graph Recurrent Neural Network for Multi-modal Emotion Recognition
[article]
2021
arXiv
pre-print
Each stream is composed of the graph transformer network for modeling the heterogeneity, the graph convolutional network for modeling the correlation, and the gated recurrent unit for capturing the temporal ...
However, it is challenging to make full use of the complementarity among spatial-spectral-temporal domain features for emotion recognition, as well as to model the heterogeneity and correlation among multi-modal ...
We are grateful for support from the Swarma-Kaifeng Workshop, which is sponsored by the Swarma Club and the Kaifeng Foundation. ...
arXiv:2108.03354v1
fatcat:o3esloogcfewddsmyzu2gv3tu4
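A single-stream PyTorch sketch of the architecture the snippet outlines, assuming one graph-convolution step per time frame followed by a GRU across frames; the graph transformer component and the second stream are omitted, and every size below is an illustrative assumption:

    import torch
    import torch.nn as nn

    class StreamSketch(nn.Module):
        def __init__(self, n_nodes, d_in, d_hid):
            super().__init__()
            self.gcn = nn.Linear(d_in, d_hid)        # node-wise projection
            self.gru = nn.GRU(n_nodes * d_hid, d_hid, batch_first=True)

        def forward(self, x, adj):
            # x: (batch, time, nodes, d_in); adj: (nodes, nodes), row-normalized
            b, t, n, _ = x.shape
            h = torch.relu(adj @ self.gcn(x))        # one GCN layer per step
            out, _ = self.gru(h.reshape(b, t, -1))   # temporal modeling
            return out[:, -1]                        # last-step summary

    x = torch.randn(2, 5, 4, 3)
    adj = torch.full((4, 4), 0.25)
    print(StreamSketch(4, 3, 16)(x, adj).shape)  # torch.Size([2, 16])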
Session-based Recommendation with Heterogeneous Graph Neural Network
[article]
2021
arXiv
pre-print
In this paper, we propose a heterogeneous graph neural network-based session recommendation method, named SR-HetGNN, which can learn session embeddings via a heterogeneous graph neural network (HetGNN), and ...
Specifically, SR-HetGNN first constructs heterogeneous graphs containing various types of nodes according to the session sequence, which can capture the dependencies among items, users, and sessions. ...
the session sequence into a homogeneous graph and uses a Graph Neural Network (GNN) to learn item embeddings. ...
arXiv:2108.05641v1
fatcat:zid77tubpncejp3kyh43ptlwi4
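A sketch of the graph-construction step under assumed inputs, where each session is a (user, item-sequence) pair; the node and edge type names are illustrative, not taken from the paper:

    from collections import defaultdict

    def build_hetero_graph(sessions):
        edges = defaultdict(set)  # (src_type, edge_type, dst_type) -> {(src, dst)}
        for sid, (user, items) in enumerate(sessions):
            edges[("user", "has", "session")].add((user, sid))
            for a, b in zip(items, items[1:]):          # item transitions
                edges[("item", "next", "item")].add((a, b))
            for it in items:
                edges[("session", "contains", "item")].add((sid, it))
        return edges

    g = build_hetero_graph([("u1", ["i1", "i2", "i1"]), ("u2", ["i2", "i3"])])
    for k, v in g.items():
        print(k, sorted(v))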
motif2vec: Motif Aware Node Representation Learning for Heterogeneous Networks
[article]
2019
arXiv
pre-print
Unlike previous efforts that use different graph meta-structures to guide the random walk, we use graph motifs to transform the original network and preserve the heterogeneity. ...
We propose motif2vec, a novel, efficient algorithm that learns node representations or embeddings for heterogeneous networks. ...
to transform the graph into a motif graph, which in turn encodes the heterogeneity. ...
arXiv:1908.08227v1
fatcat:v2rz4tieyzh4pkr6wajqdfunfq
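A sketch of the motif-transformation idea: weight node pairs by the number of triangle motifs they share, then random-walk on the resulting motif graph. The choice of the triangle motif and the walk parameters are illustrative assumptions:

    import itertools, random
    from collections import defaultdict

    def motif_graph(edges):
        adj = defaultdict(set)
        for u, v in edges:
            adj[u].add(v); adj[v].add(u)
        w = defaultdict(int)
        for u in adj:
            for v, t in itertools.combinations(sorted(adj[u]), 2):
                if t in adj[v]:                      # u-v-t forms a triangle
                    for a, b in [(u, v), (u, t), (v, t)]:
                        w[(a, b)] += 1
        return w

    def walks(w, length=5, per_node=2, seed=0):
        rng = random.Random(seed)
        nbrs = defaultdict(list)
        for (a, b), c in w.items():
            nbrs[a] += [b] * c; nbrs[b] += [a] * c   # weight-proportional draws
        out = []
        for start in nbrs:
            for _ in range(per_node):
                walk = [start]
                for _ in range(length - 1):
                    walk.append(rng.choice(nbrs[walk[-1]]))
                out.append(walk)
        return out

    w = motif_graph([(1, 2), (2, 3), (1, 3), (3, 4)])
    print(walks(w)[:2])  # sequences ready for a skip-gram style embedder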
Metapaths guided Neighbors aggregated Network for Heterogeneous Graph Reasoning
[article]
2021
arXiv
pre-print
Heterogeneous graph embedding learns the structural and semantic information of a graph and embeds it into low-dimensional node representations. ...
To address these limitations, we propose a Metapaths-guided Neighbors-aggregated Heterogeneous Graph Neural Network (MHN) model for heterogeneous graph embedding learning. ...
arXiv:2103.06474v1
fatcat:rt7lwzapebccrg3ta4zrztwc4u
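A sketch of metapath-guided neighbor aggregation, where a metapath such as Author-Paper-Author is realized as a product of adjacency matrices and features are averaged over metapath-reachable neighbors; the mean aggregator is an assumption:

    import numpy as np

    def metapath_aggregate(adjs, feats):
        """adjs: adjacency matrices along the metapath; feats: (n, d)."""
        reach = adjs[0]
        for a in adjs[1:]:
            reach = reach @ a                 # compose relations along the path
        deg = reach.sum(axis=1, keepdims=True)
        norm = np.divide(reach, deg, out=np.zeros_like(reach), where=deg > 0)
        return norm @ feats                   # mean over metapath neighbors

    AP = np.array([[1, 0], [1, 1], [0, 1]], dtype=float)  # 3 authors x 2 papers
    APA = metapath_aggregate([AP, AP.T], np.eye(3))       # Author-Paper-Author
    print(APA)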
Sequential Recommendation through Graph Neural Networks and Transformer Encoder with Degree Encoding
2021
Algorithms
on the graphs constructed under the heterogeneous information networks in an end-to-end fashion through a graph convolutional network (GCN) with degree encoding, while capturing long-range dependencies ...
In this task, it is crucial to learn sequence representations by modeling the pairwise relationships between items in the sequence so as to capture their long-range dependencies. ...
On the basis of using graph neural networks to capture the user's short-term behavioral preferences, a Transformer encoder with a degree encoder is utilized to further learn the long-term ...
doi:10.3390/a14090263
fatcat:a2aik27cojejnja6lvh77epqum
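A minimal PyTorch sketch of the degree-encoding idea: each item's degree in the session graph indexes an embedding that is added to the item embedding before a Transformer encoder. Sizes and names are illustrative:

    import torch
    import torch.nn as nn

    class DegreeEncodedEncoder(nn.Module):
        def __init__(self, n_items, max_degree, d=32):
            super().__init__()
            self.item_emb = nn.Embedding(n_items, d)
            self.deg_emb = nn.Embedding(max_degree + 1, d)
            layer = nn.TransformerEncoderLayer(d, nhead=4, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=2)

        def forward(self, items, degrees):
            # degree embedding acts like a structural positional signal
            return self.encoder(self.item_emb(items) + self.deg_emb(degrees))

    items = torch.tensor([[3, 7, 3, 1]])
    degrees = torch.tensor([[2, 1, 2, 1]])   # degrees from the session graph
    print(DegreeEncodedEncoder(10, 8)(items, degrees).shape)  # (1, 4, 32)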
Learning to Represent Programs with Heterogeneous Graphs
[article]
2022
arXiv
pre-print
A group of works adds additional edges to ASTs to convert source code into graphs and uses graph neural networks to learn representations for program graphs. ...
To address the information of node and edge types, we bring the idea of heterogeneous graphs to learning on source code and present a new formulation for building heterogeneous program graphs from ASTs with ...
We also would like to thank all the anonymous reviewers for constructive comments and suggestions to this paper. ...
arXiv:2012.04188v3
fatcat:h4smy27w5fgsphx7yxrdoipwhq
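A sketch of building a typed (heterogeneous) graph from a Python AST, with node types taken from AST class names and edge types from parent field names, plus an assumed sibling edge; this illustrates the general idea, not the paper's exact construction:

    import ast

    def ast_to_hetero_graph(source):
        tree = ast.parse(source)
        nodes, edges = {}, []
        for i, node in enumerate(ast.walk(tree)):
            nodes[id(node)] = (i, type(node).__name__)      # typed nodes
        for node in ast.walk(tree):
            for field, child in ast.iter_fields(node):
                children = child if isinstance(child, list) else [child]
                kids = [c for c in children if isinstance(c, ast.AST)]
                for c in kids:                               # typed edges
                    edges.append((nodes[id(node)][0], field, nodes[id(c)][0]))
                for a, b in zip(kids, kids[1:]):             # sibling edges
                    edges.append((nodes[id(a)][0], "next_sibling", nodes[id(b)][0]))
        return nodes, edges

    nodes, edges = ast_to_hetero_graph("x = 1\ny = x + 2")
    print(len(nodes), edges[:4])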
HETFORMER: Heterogeneous Transformer with Sparse Attention for Long-Text Extractive Summarization
[article]
2021
arXiv
pre-print
Specifically, we model different types of semantic nodes in raw text as a potential heterogeneous graph and directly learn heterogeneous relationships (edges) among nodes by Transformer. ...
To mitigate these issues, this paper proposes HETFORMER, a Transformer-based pre-trained model with multi-granularity sparse attentions for long-text extractive summarization. ...
Acknowledgements We would like to thank all the reviewers for their helpful comments. This work is supported by NSF under grants III-1763325, III-1909323, III-2106758, and SaTC-1930941. ...
arXiv:2110.06388v2
fatcat:jn3qpnqq7ffvrgjb7nbtu6zusq
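A sketch of a multi-granularity sparse attention mask in the spirit of the snippet: tokens attend within a local window while a few designated global positions attend everywhere. The window size and global slots are illustrative, not the paper's configuration:

    import numpy as np

    def sparse_mask(n, window=2, global_idx=(0,)):
        mask = np.zeros((n, n), dtype=bool)
        for i in range(n):
            lo, hi = max(0, i - window), min(n, i + window + 1)
            mask[i, lo:hi] = True                 # local sliding window
        for g in global_idx:
            mask[g, :] = True                     # global token attends to all
            mask[:, g] = True                     # and is attended by all
        return mask

    print(sparse_mask(6).astype(int))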
Curriculum Pre-Training Heterogeneous Subgraph Transformer for Top-N Recommendation
[article]
2021
arXiv
pre-print
Then we capture the rich semantics (graph structure and path semantics) within the subgraph via a heterogeneous subgraph Transformer, where we encode the subgraph with multi-slot sequence representations ...
To address the above issue, we propose a Curriculum pre-training based HEterogeneous Subgraph Transformer (called CHEST) with new data characterization, representation model and learning algorithm. ...
Then, we propose a heterogeneous subgraph Transformer to encode the subgraphs with multi-slot sequence representations. ...
arXiv:2106.06722v1
fatcat:5sxsricufveodho3fblxxrhb2a
Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning
[article]
2021
arXiv
pre-print
However, the majority of GNNs are designed only for homogeneous graphs, leading to inferior adaptivity to the more informative heterogeneous graphs with various types of nodes and edges. ...
in graph representation learning. ...
heterogeneous graph embedding approach. • HGT: With a relation-aware transformer model and a subgraph sampling strategy, the heterogeneous graph transformer [7] is the latest graph embedding model for ...
arXiv:2104.01711v2
fatcat:uozthcnesjbszdf2ciauvvgrai
Levi Graph AMR Parser using Heterogeneous Attention
[article]
2021
arXiv
pre-print
This paper presents a novel approach to AMR parsing by combining heterogeneous data (tokens, concepts, labels) as one input to a transformer to learn attention, and using only attention matrices from the transformer to predict all elements in AMR graphs (concepts, arcs, labels). ...
The heterogeneous nature of node sequences from Levi graphs allows our Graph Transformer to learn attentions among 3 types of input, tokens, concepts, and labels, leading to more informed predictions. ...
arXiv:2107.04152v1
fatcat:245h5dumffbvdbjd37q7swo2cq
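A sketch of the Levi graph transformation: each labeled arc becomes its own node, so concepts and labels form one node sequence that a single transformer can attend over. The input format here is an assumption:

    def to_levi(nodes, labeled_arcs):
        levi_nodes = list(nodes)
        levi_edges = []
        for src, label, dst in labeled_arcs:
            levi_nodes.append(label)              # the label becomes a node
            k = len(levi_nodes) - 1
            levi_edges.append((nodes.index(src), k))
            levi_edges.append((k, nodes.index(dst)))
        return levi_nodes, levi_edges

    # AMR-style fragment: (want :ARG0 boy) (want :ARG1 go)
    print(to_levi(["want", "boy", "go"],
                  [("want", ":ARG0", "boy"), ("want", ":ARG1", "go")]))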
Graph Transformer Networks
[article]
2020
arXiv
pre-print
Graph Transformer layer, a core layer of GTNs, learns a soft selection of edge types and composite relations for generating useful multi-hop connections, so-called meta-paths. ...
while learning effective node representations on the new graphs in an end-to-end fashion. ...
Here, we develop the Graph Transformer Network (GTN), which learns to transform a heterogeneous input graph into useful meta-path graphs for each task and to learn node representations on the graphs in an end-to-end ...
arXiv:1911.06455v2
fatcat:r7eaw4s4m5acffzase6w3isjli
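A sketch of the soft edge-type selection: a softmax-weighted sum over the stack of edge-type adjacency matrices yields one soft adjacency, and multiplying two such selections composes a length-2 meta-path graph. The weights shown are random stand-ins for learned parameters:

    import numpy as np

    def soft_select(adj_stack, logits):
        w = np.exp(logits) / np.exp(logits).sum()          # softmax over types
        return np.tensordot(w, adj_stack, axes=1)          # weighted adjacency

    rng = np.random.default_rng(0)
    adj_stack = rng.integers(0, 2, size=(3, 4, 4)).astype(float)  # 3 edge types
    A1 = soft_select(adj_stack, rng.normal(size=3))
    A2 = soft_select(adj_stack, rng.normal(size=3))
    meta_path = A1 @ A2   # candidate 2-hop meta-path graph, learned end-to-end
    print(meta_path.round(2))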
From Statistical Relational to Neuro-Symbolic Artificial Intelligence
2020
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Neuro-symbolic and statistical relational artificial intelligence both integrate frameworks for learning with logical reasoning. ...
These can not only be used to characterize and position neuro-symbolic artificial intelligence approaches but also to identify a number of directions for further research. ...
For example, one line of work attempts to leverage random walks to transform graph structures into sequences, which can be consumed by sequence-based embedding learning algorithms, such as ...
doi:10.24963/ijcai.2020/677
dblp:conf/ijcai/DongHWS020
fatcat:srd5r66dovefrpa5drxmrek25e
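A sketch of the random-walk-to-sequence idea the snippet mentions: uniform walks turn a graph into token sequences that a skip-gram style embedder (e.g. word2vec) can consume; parameters are illustrative:

    import random
    from collections import defaultdict

    def random_walk_corpus(edges, walk_len=6, walks_per_node=3, seed=1):
        adj = defaultdict(list)
        for u, v in edges:
            adj[u].append(v); adj[v].append(u)
        rng = random.Random(seed)
        corpus = []
        for node in adj:
            for _ in range(walks_per_node):
                walk = [node]
                for _ in range(walk_len - 1):
                    walk.append(rng.choice(adj[walk[-1]]))
                corpus.append([str(n) for n in walk])   # token sequences
        return corpus

    print(random_walk_corpus([(1, 2), (2, 3), (3, 1), (3, 4)])[:2])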
Pre-Trained Models for Heterogeneous Information Networks
[article]
2021
arXiv
pre-print
In network representation learning, we learn how to represent heterogeneous information networks in a low-dimensional space so as to facilitate effective search, classification, and prediction solutions ...
Unlike traditional network representation learning models that have to train the entire model all over again for every downstream task and dataset, PF-HIN only needs to fine-tune the model and a small ...
Heterogeneous node sequence generation: We first transform the structure of a node's neighborhood into a sequence of length k. ...
arXiv:2007.03184v2
fatcat:mne2ttzszrgpdpqs5oxo72bl24
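A sketch of turning a node's neighborhood into a fixed-length sequence: breadth-first order, truncated or padded to length k. The plain-BFS ranking is an assumption; the paper may order neighbors differently:

    from collections import defaultdict, deque

    def neighborhood_sequence(edges, start, k=5, pad="<pad>"):
        adj = defaultdict(list)
        for u, v in edges:
            adj[u].append(v); adj[v].append(u)
        seen, seq, queue = {start}, [], deque([start])
        while queue and len(seq) < k:
            node = queue.popleft()
            seq.append(node)
            for nbr in adj[node]:
                if nbr not in seen:
                    seen.add(nbr); queue.append(nbr)
        return seq + [pad] * (k - len(seq))   # pad to fixed length k

    print(neighborhood_sequence([("a", "b"), ("a", "c"), ("b", "d")], "a"))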
Syntax-informed Question Answering with Heterogeneous Graph Transformer
[article]
2022
arXiv
pre-print
We present a linguistics-informed question answering approach that extends and fine-tunes a pre-trained transformer-based neural language model with symbolic knowledge encoded with a heterogeneous graph transformer. ...
Any opinions, findings and conclusions or recommendations expressed in this material are those of the authors and do not reflect ...
arXiv:2204.09655v2
fatcat:r7guxjblk5fapf2mvl2obppcsy