Hotel Review Classification Based on the Text Pretraining Heterogeneous Graph Neural Network Model
2022
Computational Intelligence and Neuroscience
The model combines a pretrained text model, Bidirectional Encoder Representations from Transformers (BERT), with a heterogeneous graph attention network (HGAN). ...
Then, we construct a heterogeneous graph neural network and apply an attention mechanism to it to mine users' travel preferences. ...
Reference [23] proposed the heterogeneous graph attention network (HAN), a heterogeneous graph neural network model based on hierarchical attention. ...
doi:10.1155/2022/5259305
pmid:35300392
pmcid:PMC8923762
fatcat:p7pefn7gcren7gjef4zkza7y3q
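The entry above pairs BERT text encoding with a heterogeneous graph attention network. As a hedged illustration of that general pattern (not the paper's HGAN architecture), the sketch below encodes review texts with a pretrained BERT from Hugging Face transformers and aggregates the resulting node features with a single GAT-style attention head; the model name, toy graph, and layer sizes are assumptions.

```python
# Sketch: BERT node features + one graph-attention head.
# Illustrates the BERT + graph-attention pattern only, not the
# HGAN model of the paper above.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")

reviews = ["Great location, tiny room.", "Friendly staff and quiet floor."]
tokens = tokenizer(reviews, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    # Use each review's [CLS] vector as its node feature (768-dim).
    node_feats = bert(**tokens).last_hidden_state[:, 0, :]

class GraphAttentionLayer(torch.nn.Module):
    """One attention head over a dense adjacency matrix (GAT-style)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = torch.nn.Linear(in_dim, out_dim, bias=False)
        self.a = torch.nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        h = self.W(x)                                  # (N, out_dim)
        n = h.size(0)
        pairs = torch.cat([h.repeat_interleave(n, 0),  # all (i, j) pairs
                           h.repeat(n, 1)], dim=-1)
        e = torch.nn.functional.leaky_relu(self.a(pairs)).view(n, n)
        e = e.masked_fill(adj == 0, float("-inf"))     # attend to neighbours only
        return torch.softmax(e, dim=-1) @ h

adj = torch.ones(len(reviews), len(reviews))           # toy fully connected graph
out = GraphAttentionLayer(node_feats.size(-1), 64)(node_feats, adj)
print(out.shape)  # torch.Size([2, 64])
```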
Enhancing Extractive Text Summarization with Topic-Aware Graph Neural Networks
[article]
2020
arXiv
pre-print
To address these issues, this paper proposes a graph neural network (GNN)-based extractive summarization model, enabling it to capture inter-sentence relationships efficiently via graph-structured document ...
Moreover, our model integrates a joint neural topic model (NTM) to discover latent topics, which can provide document-level features for sentence selection. ...
Second, we build a heterogeneous document graph consisting of sentence and topic nodes, and simultaneously update their representations with a modified graph attention network (GAT; Veličković et al., ...
arXiv:2010.06253v1
fatcat:kxlj4h2cszhcvh3uis5vlfs7ni
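The entry above scores sentences for extraction after updating sentence and topic nodes in a heterogeneous document graph. The sketch below is a hedged, much-simplified version of that idea: each sentence attends over latent topic vectors (e.g. from a neural topic model) and is scored from the topic-aware representation; the scaled dot-product attention and linear scorer are assumptions, not the paper's architecture.

```python
# Sketch: sentence nodes attend over topic nodes, then a linear head
# scores each sentence for inclusion in the extractive summary.
import torch

n_sents, n_topics, d = 6, 4, 32
sent = torch.randn(n_sents, d)     # sentence node embeddings
topic = torch.randn(n_topics, d)   # topic node embeddings (e.g. from an NTM)

attn = torch.softmax(sent @ topic.t() / d ** 0.5, dim=-1)  # (n_sents, n_topics)
topic_ctx = attn @ topic                                    # topic context per sentence

scorer = torch.nn.Linear(2 * d, 1)
scores = scorer(torch.cat([sent, topic_ctx], dim=-1)).squeeze(-1)
print(scores.topk(k=2).indices)    # indices of the 2 highest-scoring sentences
```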
Heterogeneous Graph Attention Networks for Semi-supervised Short Text Classification
2019
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Then, we propose Heterogeneous Graph ATtention networks (HGAT) to embed the HIN for short text classification based on a dual-level attention mechanism, including node-level and type-level attentions. ...
In particular, we first present a flexible HIN (heterogeneous information network) framework for modeling the short texts, which can integrate any type of additional information as well as capture their ...
graph neural networks on the HIN for semi-supervised classification. 2) We propose novel heterogeneous graph attention networks (HGAT) for the HIN embedding based on a new dual-level attention mechanism ...
doi:10.18653/v1/d19-1488
dblp:conf/emnlp/HuYSJL19
fatcat:kc6k7t335rfr5lvmprz22t2enm
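The HGAT entry above rests on a dual-level attention mechanism: node-level attention weights the neighbours within each node type, and type-level attention weights the resulting per-type summaries. The sketch below illustrates that two-stage weighting with plain dot-product scores; the scoring functions and shapes are simplifying assumptions rather than the paper's exact formulation.

```python
# Sketch of dual-level attention: weight neighbours within each type
# (node level), then weight the per-type summaries (type level).
import torch

def dual_level_attention(center, neighbors_by_type):
    """center: (d,) embedding of the short text being classified.
    neighbors_by_type: dict mapping a type name (e.g. 'entity', 'topic')
    to a (k, d) tensor of neighbour embeddings of that type."""
    type_summaries = []
    for feats in neighbors_by_type.values():
        alpha = torch.softmax(feats @ center, dim=0)  # node-level weights (k,)
        type_summaries.append(alpha @ feats)          # per-type summary (d,)
    summaries = torch.stack(type_summaries)           # (T, d)
    beta = torch.softmax(summaries @ center, dim=0)   # type-level weights (T,)
    return beta @ summaries                           # fused representation (d,)

d = 16
out = dual_level_attention(torch.randn(d), {
    "entity": torch.randn(3, d),   # entities linked to the short text
    "topic":  torch.randn(5, d),   # latent topics assigned to it
})
print(out.shape)  # torch.Size([16])
```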
Topic-to-Essay Generation with Comprehensive Knowledge Enhancement
[article]
2021
arXiv
pre-print
Thus, a topic-to-essay generation model with comprehensive knowledge enhancement, named TEGKE, is proposed. ...
For external knowledge enhancement, a topic knowledge graph encoder is proposed. ...
The attention scores over the topic knowledge graph are shown on the left side. Deeper green indicates higher attention scores. ...
arXiv:2106.15142v1
fatcat:w3pjqzq5ajfzrp2o4xypf4v3nm
Reviews Meet Graphs: Enhancing User and Item Representations for Recommendation with Hierarchical Attentive Graph Neural Network
2019
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
In the graph-view, we propose a hierarchical graph neural network to jointly model the user-item, user-user and item-item relatedness by capturing the first- and second-order interactions between users and ...
In addition, we apply attention mechanism to model the importance of these interactions to learn informative user and item representations. ...
Thus, we propose to use an attentive graph neural network to model the importance of the users connected to the item node. ...
doi:10.18653/v1/d19-1494
dblp:conf/emnlp/WuWQGHX19
fatcat:insvhrz6kzce7dsyhx63jrcepy
Graph Neural News Recommendation with Long-term and Short-term Interest Modeling
[article]
2019
arXiv
pre-print
In this paper, we propose to build a heterogeneous graph to explicitly model the interactions among users, news and latent topics. ...
We also consider a user's short-term interest using the recent reading history with an attention based LSTM model. ...
Our graph contains three types of nodes: users U, news items I, and topics Z. The topics Z can be mined through the topic model LDA [29]. ...
arXiv:1910.14025v2
fatcat:unpsq7kqynfqjlof3nxyq4uan4
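The entry above builds a heterogeneous graph whose nodes are users, news items, and LDA-mined latent topics. Below is a hedged sketch of that graph construction step using scikit-learn's LDA and networkx; the library choices, the toy click data, and the 0.2 topic-probability threshold are illustrative assumptions, not the paper's pipeline.

```python
# Sketch: mine latent topics from news texts with LDA, then build a
# heterogeneous graph with user, news and topic nodes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
import networkx as nx

news = ["stocks rally as markets rebound",
        "new vaccine trial shows promising results",
        "central bank hints at rate cut"]
clicks = {"u1": [0, 2], "u2": [1]}          # user -> indices of clicked news

counts = CountVectorizer().fit_transform(news)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)      # (num_news, num_topics)

g = nx.Graph()
g.add_nodes_from((f"news_{i}" for i in range(len(news))), kind="news")
g.add_nodes_from((f"topic_{t}" for t in range(2)), kind="topic")
g.add_nodes_from(clicks, kind="user")

for user, items in clicks.items():          # user-news edges from click history
    g.add_edges_from((user, f"news_{i}") for i in items)

for i, probs in enumerate(doc_topics):      # news-topic edges above a threshold
    for t, p in enumerate(probs):
        if p > 0.2:
            g.add_edge(f"news_{i}", f"topic_{t}", weight=float(p))

print(g.number_of_nodes(), g.number_of_edges())
```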
Improving Cyberbully Detection with User Interaction
[article]
2020
arXiv
pre-print
Drawing on recent advances in graph neural networks, we then propose a principled approach for modeling the temporal dynamics and topic coherence throughout user interactions. ...
We also show that modeling such topic coherence and temporal interaction is critical to capturing the repetitive characteristics of bullying behavior, thus leading to better predictive performance. ...
This is in part because our approach effectively models user interactions in a graph attention neural network with a focus on temporal dynamics and topic coherence of a session. ...
arXiv:2011.00449v1
fatcat:qzo4sw27m5ca7j7ee4kg3qwiyy
DynGraph2Seq: Dynamic-Graph-to-Sequence Interpretable Learning for Health Stage Prediction in Online Health Forums
[article]
2019
arXiv
pre-print
We go on to propose dynamic graph hierarchical attention mechanisms to facilitate the necessary multi-level interpretability. ...
problem, and hence propose a novel dynamic graph-to-sequence neural network architecture (DynGraph2Seq) to address all the challenges. ...
Dynamic Graph Representation Learning: As an emerging topic in the graph representation learning domain, dynamic graph learning has attracted a great deal of attention from researchers in recent years ...
arXiv:1908.08497v1
fatcat:u2mc6ig7ubaofnr443rcjyweqa
Research on Domain Information Mining and Theme Evolution of Scientific Papers
[article]
2022
arXiv
pre-print
This paper reviews the domestic and international state of research on domain information mining and topic evolution patterns in scientific and technological papers from three aspects: the semantic feature
representation learning of scientific and technological papers, the domain information mining of scientific and technological papers, and the mining and prediction of research topic evolution rules of
dependencies of research topics, and the representation learning of graph states [37][49] has gradually attracted attention. ...
arXiv:2204.08476v1
fatcat:7cte3exhajbilbkvhktjgyvqha
An Introduction to Graph Neural Networks
2022
Proceedings of the ... International Florida Artificial Intelligence Research Society Conference
Researchers have been working to adapt neural networks to operate on graph data for more than a decade. ...
Graph Neural Networks (GNNs) are considered a subset of deep learning methods designed to extract important information and make useful predictions on graph representations. ...
This tutorial will cover relevant GNN-related topics, including the basics of learning on graph structured data, graph embeddings, attention networks, aggregation functions and examples of applications ...
doi:10.32473/flairs.v35i.130613
fatcat:q5xzimhnqzeuvkfdffehpc3efi
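Since the tutorial entry above lists aggregation functions as a core ingredient of GNN layers, here is a minimal, hedged sketch of one round of message passing with mean aggregation over an adjacency matrix; the NumPy implementation, feature sizes, and self/neighbour concatenation are illustrative choices rather than any particular model from the tutorial.

```python
# Sketch: one GNN layer with mean aggregation. Each node averages its
# neighbours' features, concatenates them with its own, and applies a
# shared linear map followed by a ReLU.
import numpy as np

rng = np.random.default_rng(0)

adj = np.array([[0, 1, 1],
                [1, 0, 0],
                [1, 0, 0]], dtype=float)     # toy 3-node graph
x = rng.normal(size=(3, 4))                  # node features (N, d_in)
W = rng.normal(size=(2 * 4, 8))              # shared weights (2*d_in, d_out)

deg = adj.sum(axis=1, keepdims=True)
neigh_mean = (adj @ x) / np.maximum(deg, 1)  # mean of neighbour features
h = np.concatenate([x, neigh_mean], axis=1)  # combine self and neighbourhood
out = np.maximum(h @ W, 0)                   # ReLU(linear(...))
print(out.shape)                             # (3, 8)
```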
Personalized News Recommendation: Methods and Challenges
[article]
2022
arXiv
pre-print
MVL [134] uses attention networks to learn user interest representations in a content view, and uses a graph attention network to model user interest from the user-news graph in a graph view. ...
It predicts the topic of news based on texts and concepts, and uses the predicted topic to enrich the knowledge graph and learn topic-enriched knowledge representations of news with graph neural networks ...
arXiv:2106.08934v3
fatcat:iagqsw73hrehxaxpvpydvtr26m
Measuring Time-Sensitive and Topic-Specific Influence in Social Networks with LSTM and Self-Attention
2020
IEEE Access
Most existing approaches can only model the influence based on static social network structures and topic distributions. ...
To address these challenges, we propose a Time-sensitive and Topic-specific Influence Measurement (TTIM) method, to jointly model the streaming texts and dynamic social networks. ...
Inspired by Graph Attention Networks (GAT) [18], [21] and DeepInf [10], we propose the influence attention network, which can aggregate the node topic distribution with attention on the node's local ...
doi:10.1109/access.2020.2991683
pmid:32577335
pmcid:PMC7311102
fatcat:jbptmhhp65gadp435jmizzaqyy
A Knowledge Driven Dialogue Model with Reinforcement Learning
2020
IEEE Access
Our experiments clearly demonstrate the superior performance of our model over other baselines. Index Terms: Dialogue model, policy gradient, knowledge graph, transformer network. ...
The proposed model is able to effectively complete the knowledge driven dialogue task with specific topic. ...
[10] proposed a static and dynamic graph attention mechanism to fuse the knowledge graph into the end-to-end chatbot model. ...
doi:10.1109/access.2020.2993924
fatcat:iqzjcl7erjbxrnyrx43kxfnjzy
Attention-based Graph Neural Network for Semi-supervised Learning
[article]
2018
arXiv
pre-print
Based on this insight, we propose a novel graph neural network that removes all the intermediate fully-connected layers, and replaces the propagation layers with attention mechanisms that respect the structure ...
Recently popularized graph neural networks achieve the state-of-the-art accuracy on a number of standard benchmark datasets for graph-based semi-supervised learning, improving significantly over existing ...
Dissection of Graph Neural Network: In this section, we propose a novel Graph Neural Network (GNN) model, which we call the Attention-based Graph Neural Network (AGNN), and compare its performance to state-of-the-art ...
arXiv:1803.03735v1
fatcat:6bpomyo4mrfqrcwrvfk6d3klvm
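The AGNN entry above replaces intermediate fully-connected layers with attention-based propagation that respects the graph structure. The sketch below is a hedged rendering of that general idea, weighting neighbours by cosine similarity scaled by a single learnable scalar; the exact parameterisation in the paper may differ.

```python
# Sketch: attention-based propagation in the spirit of the AGNN entry.
# Neighbour weights come from scaled cosine similarity between node
# features rather than from intermediate fully-connected layers.
import torch

class AttentionPropagation(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.beta = torch.nn.Parameter(torch.tensor(1.0))  # single learned scalar

    def forward(self, x, adj):
        h = torch.nn.functional.normalize(x, dim=-1)   # unit-length rows
        scores = self.beta * (h @ h.t())                # cosine similarities
        scores = scores.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(scores, dim=-1)           # attention over neighbours
        return alpha @ x                                # propagate features

adj = torch.tensor([[1., 1., 0.],
                    [1., 1., 1.],
                    [0., 1., 1.]])    # self-loops keep the softmax well defined
x = torch.randn(3, 5)
print(AttentionPropagation()(x, adj).shape)  # torch.Size([3, 5])
```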
Graph neural network for merger and acquisition prediction
2021
Proceedings of the Second ACM International Conference on AI in Finance
Our M&A prediction solution integrates with the topic model for text analysis, advanced feature engineering, and several tricks to boost GNN. ...
This paper investigates the application of graph neural networks (GNN) in Mergers and Acquisitions (M&A) prediction, which aims to quantify the relationship between companies, their founders, and investors ...
The algorithms cover the baseline model, a graph neural network, a graph neural network with XGBoost, and a graph neural network with a topic model. ...
doi:10.1145/3490354.3494368
fatcat:lb4khcz6jfbf7ixuo7jz6q3nh4
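The M&A entry above combines a graph neural network with XGBoost and a topic model. One common way to realise the "GNN with XGBoost" part is to hand learned node embeddings to a boosted-tree classifier as tabular features; the sketch below shows that hand-off with random embeddings standing in for a trained GNN encoder, and the pair-feature layout is an assumption rather than the paper's feature engineering.

```python
# Sketch: GNN node embeddings as tabular features for an XGBoost
# classifier on candidate acquirer/target pairs (synthetic data).
import numpy as np
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
num_companies, d = 200, 16
emb = rng.normal(size=(num_companies, d))        # per-company GNN embeddings

pairs = rng.integers(0, num_companies, size=(500, 2))  # candidate pairs
labels = rng.integers(0, 2, size=500)                   # 1 = deal happened

# Pair features: concatenate the two companies' embeddings.
features = np.concatenate([emb[pairs[:, 0]], emb[pairs[:, 1]]], axis=1)

model = XGBClassifier(n_estimators=50, max_depth=3, eval_metric="logloss")
model.fit(features[:400], labels[:400])
print(model.predict_proba(features[400:]).shape)  # (100, 2)
```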
Showing results 1 — 15 out of 121,077 results