Equivariant and Stable Positional Encoding for More Powerful Graph Neural Networks

Haorui Wang, Haoteng Yin, Muhan Zhang, Pan Li
2022 · arXiv preprint
Graph neural networks (GNNs) have shown great advantages in many graph-based learning tasks but often fail to predict accurately for tasks defined on sets of nodes, such as link prediction and motif prediction. Many recent works address this problem with random node features or node distance features, but these suffer from slow convergence, inaccurate prediction, or high complexity. In this work, we revisit GNNs that use positional features of nodes produced by positional encoding (PE) techniques such as Laplacian Eigenmap and DeepWalk. GNNs with PE are often criticized as neither generalizable to unseen graphs (inductive) nor stable. Here, we study these issues in a principled way and propose a provable solution: a class of GNN layers termed PEG, with rigorous mathematical analysis. PEG uses separate channels to update the original node features and the positional features. It simultaneously imposes permutation equivariance with respect to the original node features and O(p) (orthogonal group) equivariance with respect to the positional features, where p is the dimension of the positional features used. Extensive link prediction experiments over 8 real-world networks demonstrate the advantages of PEG in generalization and scalability.
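The abstract's description of the PEG layer (separate update channels, with the positional features entering only through pairwise distances) can be illustrated with a minimal PyTorch sketch. This is a hedged illustration based on the abstract alone, not the paper's implementation: the class name PEGLayer, the edge_mlp, and the plain sum aggregation are all assumed choices, and the published layer may differ in normalization and aggregation details. What the sketch does show is the mechanism behind the two equivariances: edge weights use only ||z_u - z_v||, which is invariant to any orthogonal transform of Z, while the message passing over X is permutation equivariant.

import torch
import torch.nn as nn

class PEGLayer(nn.Module):
    """Sketch of a PEG-style layer (names and details are illustrative).
    Node features X are updated by message passing whose edge weights
    depend only on distances between positional features Z; the positional
    channel Z passes through unchanged."""

    def __init__(self, in_dim, out_dim, hidden=16):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        # small MLP mapping a scalar positional distance to an edge weight
        self.edge_mlp = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, x, z, edge_index):
        # x: (n, in_dim) node features; z: (n, p) positional features
        # edge_index: (2, m) source/target node indices
        src, dst = edge_index
        # O(p)-invariant edge weights: only ||z_u - z_v|| is used
        dist = (z[src] - z[dst]).norm(dim=-1, keepdim=True)  # (m, 1)
        w = self.edge_mlp(dist)                              # (m, 1)
        # weighted sum aggregation of transformed neighbor features
        msg = w * self.lin(x)[src]                           # (m, out_dim)
        out = torch.zeros(x.size(0), msg.size(1), device=x.device)
        out.index_add_(0, dst, msg)
        # separate channels: Z is returned untouched
        return torch.relu(out), z

Because only distances enter the computation, replacing z with z @ Q for any orthogonal (p, p) matrix Q leaves the node-feature output unchanged and rotates the positional channel consistently, which is the O(p) equivariance the abstract refers to.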
arXiv:2203.00199v5