Last time, I explained how GNNs work: they gather information from neighbors and aggregate it to predict the classes we are interested in. Today, I would like to go deeper into one of the most popular GNNs, called “Graph Convolutional Networks” or “GCN” (1). Let us go step by step.
1. Adjacency Matrix
As you know, a graph has edges, or links, among its nodes. Therefore, a graph can be specified by an adjacency matrix A and node features H. An adjacency matrix is very useful for representing the relationships among nodes: if the graph has a link from node “i” to node “j”, the element of A in row “i” and column “j” is “1”, and otherwise “0”. This is shown in the chart below. If each node also has a link to itself, the diagonal elements of the adjacency matrix are “1”.
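To make this concrete, here is a small NumPy sketch that builds an adjacency matrix from a hypothetical four-node edge list and then adds self-loops so that the diagonal elements become “1”:

```python
import numpy as np

# A hypothetical directed 4-node graph with links 0->1, 1->2, 2->0, 2->3.
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
n = 4

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = 1  # row "i", column "j" is 1 when there is a link from i to j

# Adding self-loops (a link from each node to itself) sets the diagonal to 1.
A_tilde = A + np.eye(n)
print(A_tilde)
```

For an undirected graph, each edge would be entered in both directions, making A symmetric.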
2. Gather information from neighbors
Let me explain the chart below. 1. The node, shown in red in the chart, gathers information from each of its neighbors. 2. The gathered information is aggregated to update the node. The aggregation can be, for example, a sum or an average.
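The two steps above can be sketched as matrix operations; the tiny three-node graph and feature values here are hypothetical:

```python
import numpy as np

# Hypothetical undirected 3-node graph: node 0 is linked to nodes 1 and 2.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)

# Hypothetical node features H (one row per node, two features each).
H = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Sum aggregation: each node sums the features of its neighbors.
agg_sum = A @ H

# Average aggregation: divide each sum by the node's degree.
deg = A.sum(axis=1, keepdims=True)
agg_mean = agg_sum / deg
```

For node 0, whose neighbors are nodes 1 and 2, the sum is [8, 10] and the average is [4, 5].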
As I said above, a graph can be specified by an adjacency matrix A and node features H (or X). I introduce W as the matrix of learnable weights, and D as the degree matrix of A, a diagonal matrix whose entries count each node’s links. Note that the adjacency matrix here includes self-loops, so its diagonal elements are “1”. σ is a non-linear activation function.
In GCN, the way to gather information is as follows (1).
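For reference, the layer-wise propagation rule from (1) is usually written as follows, where Ã = A + I is the adjacency matrix with self-loops, D̃ is its degree matrix, and l indexes the layer:

```latex
H^{(l+1)} = \sigma\!\left( \tilde{D}^{-\frac{1}{2}} \, \tilde{A} \, \tilde{D}^{-\frac{1}{2}} \, H^{(l)} \, W^{(l)} \right)
```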
It means that the information from each neighbor is weighted based on the degrees of both the sender (green) and the receiver (red). All of the information is then aggregated to update the receiver (red).
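A single GCN layer, σ(D̃^(-1/2) Ã D̃^(-1/2) H W), can be sketched in a few lines of NumPy. The graph, the feature and weight sizes, and the choice of ReLU as the non-linearity σ are all hypothetical here:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: sigma(D^{-1/2} A_tilde D^{-1/2} H W)."""
    A_tilde = A + np.eye(A.shape[0])            # add self-loops
    d = A_tilde.sum(axis=1)                     # degrees, including self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))      # D^{-1/2}
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # symmetric normalization
    return np.maximum(0.0, A_hat @ H @ W)       # ReLU as the non-linearity

# Usage with hypothetical sizes: 3 nodes, 2 input features, 4 hidden units.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = rng.normal(size=(3, 2))
W = rng.normal(size=(2, 4))
H_next = gcn_layer(A, H, W)  # shape (3, 4)
```

Notice that the normalization divides each message by the square root of both the sender’s and the receiver’s degree, which is exactly the weighting described above.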
In the formula below, GCN is viewed more generally as a scheme in which the features of the neighborhood are directly aggregated with fixed weights (2).
“cuv” can be considered the importance of node v to node u. It is a constant that depends directly on the entries of the adjacency matrix A.
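For GCN in particular, this fixed weight is cuv = 1 / √(deg(u) · deg(v)), with degrees taken from the adjacency matrix including self-loops. A small sketch, on the same hypothetical three-node graph as before:

```python
import numpy as np

# Hypothetical 3-node graph; add self-loops before computing degrees.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
A_tilde = A + np.eye(3)
deg = A_tilde.sum(axis=1)  # [2, 3, 2]

def c(u, v):
    """GCN's fixed weight: importance of node v's message to node u."""
    return 1.0 / np.sqrt(deg[u] * deg[v])

print(c(0, 1))  # 1/sqrt(2*3) ~ 0.408
```

So a message from a high-degree neighbor is down-weighted, and a message to a high-degree receiver is down-weighted as well.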
That’s it! I hope you now understand how GCN works. In the next article, I would like to solve a problem with GCN. Stay tuned!
(1) Semi-Supervised Classification with Graph Convolutional Networks, Thomas N. Kipf & Max Welling, 22 Feb 2017
(2) Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges, p. 79, Michael M. Bronstein, Joan Bruna, Taco Cohen, Petar Veličković, 4 May 2021
Notice: ToshiStats Co., Ltd. and I do not accept any responsibility or liability for loss or damage occasioned to any person or property through using materials, instructions, methods, algorithms or ideas contained herein, or acting or refraining from acting as a result of such use. ToshiStats Co., Ltd. and I expressly disclaim all implied warranties, including merchantability or fitness for any particular purpose. There will be no duty on ToshiStats Co., Ltd. and me to correct any errors or defects in the codes and the software.