Graph Neural Networks in Recommender Systems: A Survey
2022 · ACM Computing Surveys
Recently, graph neural network (GNN) techniques have been widely utilized in recommender systems since most of the information in recommender systems essentially has graph structure and GNN has superiority ... in graph representation learning. ... two-level attention network for propagation. ...
doi:10.1145/3535101
fatcat:hgv2tbx3k5hzbnkupwsysqwjmy
Graph Neural Networks in Recommender Systems: A Survey [article]
2022 · arXiv pre-print
Recently, graph neural network (GNN) techniques have been widely utilized in recommender systems since most of the information in recommender systems essentially has graph structure and GNN has superiority ... in graph representation learning. ... two-level attention network for propagation. ...
arXiv:2011.02260v4
fatcat:hvk22yyid5bzjnzmzchyti25ja
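The survey records above concern GNN-based propagation over user-item interaction graphs. As a minimal sketch of that idea (not code from the survey; the LightGCN-style simplification, dense adjacency, and all names and shapes are my own assumptions):

```python
# Minimal sketch of GNN propagation for recommendation (LightGCN-style,
# dense adjacency for clarity). Illustrative only; not code from the survey.
import torch

def propagate(user_emb, item_emb, interactions, num_layers=2):
    """user_emb: (U, d), item_emb: (I, d), interactions: (U, I) binary matrix."""
    U, I = interactions.shape
    # Symmetric bipartite adjacency with D^{-1/2} A D^{-1/2} normalization.
    A = torch.zeros(U + I, U + I)
    A[:U, U:] = interactions
    A[U:, :U] = interactions.t()
    d_inv_sqrt = A.sum(dim=1).clamp(min=1.0).pow(-0.5)
    A_norm = d_inv_sqrt.unsqueeze(1) * A * d_inv_sqrt.unsqueeze(0)

    # Propagate embeddings over the graph and average the layer outputs.
    x = torch.cat([user_emb, item_emb], dim=0)
    outputs = [x]
    for _ in range(num_layers):
        x = A_norm @ x
        outputs.append(x)
    x_final = torch.stack(outputs).mean(dim=0)
    return x_final[:U], x_final[U:]

# Recommendation scores are inner products of the propagated embeddings.
users, items = propagate(torch.randn(5, 8), torch.randn(7, 8),
                         (torch.rand(5, 7) > 0.7).float())
scores = users @ items.t()   # (5, 7) user-item preference scores
```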
Feature-level Attentive ICF for Recommendation [article]
2021 · arXiv pre-print
To demonstrate the effectiveness of our method, we design a light attention neural network to integrate both item-level and feature-level attention for neural ICF models. ... In this work, we propose a general feature-level attention method for ICF models. ... Different from this method, DisenHAN [63] learns disentangled representations by aggregating aspect features from different meta-relations in a heterogeneous information network (HIN). ...
arXiv:2102.10745v2
fatcat:wz3luznoqnfw5newwr3bco32za
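The snippet above mentions integrating item-level and feature-level attention in a neural ICF (item-based collaborative filtering) scorer. One way such a two-granularity attention could be wired up, sketched under my own assumptions (the parameter names, layer shapes, and scoring form are illustrative, not the paper's implementation):

```python
# Minimal sketch: feature-level + item-level attention in an item-based CF
# scorer. Illustrative only; W_feat and w_item are assumed learnable params.
import torch
import torch.nn.functional as F

def attentive_icf_score(target, history, W_feat, w_item):
    """target: (d,) target-item embedding; history: (n, d) embeddings of the
    user's historically interacted items; W_feat: (d, d); w_item: (d,)."""
    # Feature-level attention: per-dimension weights conditioned on the
    # (target, history-item) pair, applied elementwise.
    feat_att = torch.sigmoid((history * target) @ W_feat)   # (n, d)
    weighted = feat_att * history * target                  # (n, d)

    # Item-level attention: one scalar weight per history item.
    item_att = F.softmax(weighted @ w_item, dim=0)          # (n,)

    # Final preference score: attention-weighted sum of pairwise interactions.
    return (item_att * weighted.sum(dim=1)).sum()

d, n = 16, 6
score = attentive_icf_score(torch.randn(d), torch.randn(n, d),
                            torch.randn(d, d), torch.randn(d))
```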
DisenKGAT: Knowledge Graph Embedding with Disentangled Graph Attention Network [article]
2021 · pre-print
In this work, we propose a novel Disentangled Knowledge Graph Attention Network (DisenKGAT) for KGC, which leverages both micro-disentanglement and macro-disentanglement to exploit representations behind ... For macro-disentanglement, we leverage mutual information as a regularization to enhance independence. ... The above models develop the disentanglement idea in homogeneous networks, while DisenHAN [39] focuses on heterogeneous networks. ...
doi:10.1145/3459637.3482424
arXiv:2108.09628v1
fatcat:k2cekvcwqrfttpngeazvprbhsu
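The DisenKGAT snippet describes macro-disentanglement: a mutual-information regularizer that encourages independence between an entity's K component representations. As a hedged sketch, the code below substitutes a simpler cross-component cosine-similarity penalty for the paper's mutual-information estimator; all names and shapes are illustrative assumptions, not DisenKGAT's implementation:

```python
# Illustrative stand-in for a macro-disentanglement regularizer over K
# component embeddings per entity (DisenKGAT itself uses mutual information;
# here a cross-component similarity penalty plays that role).
import torch
import torch.nn.functional as F

def independence_penalty(components):
    """components: (N, K, d) - K component embeddings for each of N entities."""
    z = F.normalize(components, dim=-1)            # unit-norm per component
    sim = torch.einsum('nkd,nld->nkl', z, z)       # (N, K, K) cosine similarities
    K = components.shape[1]
    off_diag = sim - torch.diag_embed(torch.diagonal(sim, dim1=1, dim2=2))
    # Penalize similarity between *different* components of the same entity.
    return (off_diag ** 2).sum() / (components.shape[0] * K * (K - 1))

penalty = independence_penalty(torch.randn(32, 4, 16))
# Added to the KG-completion loss with a small coefficient, this pushes the
# K views of each entity toward capturing independent factors.
```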