
Event

Martina Neuman (Michigan State University)

Monday, January 30, 2023, 16:30 to 17:30
Burnside Hall Room 708, 805 rue Sherbrooke Ouest, Montreal, QC, H3A 0B9, CA

Abstract: Graph Neural Networks (GNNs), with their ability to integrate graph information, have been widely used for data analysis. However, the expressive power of GNNs has only been studied for graph-level tasks, not for node-level tasks such as node classification, where one tries to interpolate missing nodal labels from the observed ones. In this paper, we study the expressive power of GNNs for the said classification task, which is in essence a function interpolation problem. Explicitly, we derive the number of weights and layers needed for a GNN to interpolate a band-limited function in \mathbb{R}^d. Our result shows that the number of weights needed to \epsilon-approximate a band-limited function using the GNN architecture is much smaller than the best known bound for a fully connected neural network (NN): in particular, one only needs O((\log \epsilon^{-1})^{d}) weights, using a GNN trained on O((\log \epsilon^{-1})^{d}) samples, to \epsilon-approximate a discretized band-limited signal in \mathbb{R}^d. The result is obtained by drawing a connection between the GNN structure and the classical sampling theorems, making our work the first attempt in this direction.
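The classical sampling theorem the abstract refers to is the Whittaker-Shannon result: a band-limited signal is exactly determined by uniform samples taken above the Nyquist rate. The sketch below (a standalone illustration, not the paper's GNN construction) reconstructs a band-limited signal from its samples by sinc interpolation, the interpolation problem that the talk connects to node classification.

```python
import numpy as np

def sinc_interpolate(samples, T, t):
    """Reconstruct a band-limited signal from samples x[n] = x(nT).

    Whittaker-Shannon formula: x(t) = sum_n x[n] * sinc((t - nT) / T),
    exact when x is band-limited below the Nyquist frequency 1/(2T).
    """
    n = np.arange(len(samples))
    # np.sinc(x) = sin(pi x) / (pi x); broadcast over evaluation points t.
    return np.sum(samples * np.sinc((t[:, None] - n * T) / T), axis=1)

# Example: a 2 Hz sine, sampled at 8 Hz (above the 4 Hz Nyquist rate).
T = 1.0 / 8.0
n = np.arange(64)
samples = np.sin(2 * np.pi * 2.0 * n * T)

# Evaluate at interior points, away from truncation effects at the edges.
t = np.linspace(1.0, 6.0, 200)
reconstructed = sinc_interpolate(samples, T, t)
exact = np.sin(2 * np.pi * 2.0 * t)
print(np.max(np.abs(reconstructed - exact)))  # small truncation error
```

In the paper's setting, the signal lives in \mathbb{R}^d and the GNN plays the role of the interpolator, achieving the O((\log \epsilon^{-1})^{d}) weight count stated above.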

