Learning by Transference: Training Graph Neural Networks on Growing Graphs [article]

Juan Cervino, Luana Ruiz, Alejandro Ribeiro
2022, arXiv preprint
Graph neural networks (GNNs) use graph convolutions to exploit network invariances and learn meaningful feature representations from network data. On large-scale graphs, however, convolutions incur a high computational cost, leading to scalability limitations. Leveraging the graphon -- the limit object of a sequence of graphs -- in this paper we consider the problem of learning a graphon neural network (WNN) -- the limit object of a GNN -- by training GNNs on graphs sampled from the graphon. Under ... conditions, we show that: (i) the expected distance between the learning steps on the GNN and on the WNN decreases asymptotically with the size of the graph, and (ii) when training on a sequence of growing graphs, gradient descent follows the learning direction of the WNN. Inspired by these results, we propose a novel algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training. This algorithm is further benchmarked on a decentralized control problem, where it retains performance comparable to that of its large-scale counterpart at a reduced computational cost.
arXiv:2106.03693v3
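The core idea of the abstract -- sampling graphs of growing size from a graphon and training on the resulting sequence -- can be illustrated with a minimal sketch. The specific graphon W(x, y) = exp(-|x - y|), the filter taps, and the node-count schedule below are illustrative assumptions, not the paper's actual experimental setup; the gradient step on the GNN parameters is omitted.

```python
import numpy as np

def graphon(x, y):
    # Illustrative graphon (assumption): W(x, y) = exp(-|x - y|)
    return np.exp(-np.abs(x - y))

def sample_graph(n, rng):
    """Sample an n-node graph from the graphon: draw latent positions
    u_i ~ Uniform[0, 1] and include edge (i, j) with probability W(u_i, u_j)."""
    u = rng.uniform(size=n)
    probs = graphon(u[:, None], u[None, :])
    A = (rng.uniform(size=(n, n)) < probs).astype(float)
    A = np.triu(A, 1)
    A = A + A.T            # symmetric adjacency, no self-loops
    return A / n           # normalization so the operator converges as n grows

def graph_convolution(A, X, h):
    """Order-K graph filter: sum_k h[k] * A^k @ X (a GNN layer's linear part)."""
    out = np.zeros_like(X)
    P = X.copy()           # A^0 @ X
    for hk in h:
        out += hk * P
        P = A @ P
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    h = np.array([1.0, 0.5, 0.25])      # illustrative filter taps (assumption)
    for n in [50, 100, 200]:            # growing-graph training schedule
        A = sample_graph(n, rng)
        X = rng.normal(size=(n, 1))
        Y = graph_convolution(A, X, h)
        # ...a gradient step on the filter taps h would go here...
        print(n, Y.shape)
```

As the abstract's result (i) suggests, the gradient step computed on each sampled graph approximates the gradient step on the WNN increasingly well as n grows, which is what justifies starting small and enlarging the graph during training.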