What functions can GNNs compute on large random graphs?
In this talk, I will discuss some recent work on the properties of Graph Neural Networks (GNNs) on large graphs. Indeed, most existing analyses of GNNs are purely combinatorial and may fail to capture their behavior with respect to large-scale structures in graphs, such as communities of well-connected nodes or underlying manifold structures. To address this, we assume that the graphs of interest are generated by classical random graph models. We first give non-asymptotic bounds on the convergence of GNNs toward "continuous" equivalents as the number of nodes grows. Then, we study their universality and approximation power, and show that some recent GNN architectures are more powerful than others. Finally, we characterize the role of recent methodologies that augment the input node features of the graph, and derive some concentration inequalities of independent interest. This is joint work with Samuel Vaiter (CNRS) and Alberto Bietti (Flatiron Institute).
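As a rough, self-contained illustration of the setting described above (my own sketch, not a construction from the talk), the snippet below samples graphs of increasing size from a two-community stochastic block model and applies a single mean-aggregation GNN layer. The deviation between the discrete aggregation and its population ("continuous") counterpart shrinks as the number of nodes grows, which is the kind of convergence the non-asymptotic bounds quantify. The model, the edge probabilities p_in and p_out, and the toy node feature are all illustrative assumptions.

```python
# Illustrative sketch (assumptions, not the talk's actual construction):
# sample a graph from a 2-community stochastic block model (SBM) and apply
# one mean-aggregation GNN layer; as n grows, the aggregated node features
# concentrate around their population ("continuous") counterpart.
import numpy as np

rng = np.random.default_rng(0)
p_in, p_out = 0.8, 0.2   # within- / between-community edge probabilities (assumed values)

def sbm_mean_aggregation_gap(n):
    """Max deviation between one discrete mean-aggregation step and its population limit."""
    z = rng.integers(0, 2, size=n)                         # latent community labels
    P = np.where(z[:, None] == z[None, :], p_in, p_out)    # edge-probability matrix
    U = np.triu(rng.random((n, n)) < P, k=1)               # sample upper triangle
    A = (U + U.T).astype(float)                            # symmetric adjacency, no self-loops
    x = z.astype(float)                                    # toy input node feature
    deg = A.sum(axis=1).clip(min=1.0)
    agg = (A @ x) / deg                                    # discrete mean aggregation
    pop = np.where(z == 1, p_in, p_out) / (p_in + p_out)   # population ("continuous") aggregation
    return np.abs(agg - pop).max()

for n in (200, 1000, 5000):
    print(n, sbm_mean_aggregation_gap(n))                  # deviation shrinks as n grows
```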