Introduction to Graph Theory

Why Graph Theory?

Graphs are the data structure at the heart of graph neural networks (GNNs), and a solid grasp of fundamental graph theory is essential for understanding how GNNs work. So let's explore graph theory in detail.

Basic Representation of the Graph

Directed vs Undirected graph

Algebraic Representation of the Graph

Adjacency matrix

\[A_{ij}=\begin{cases} 1 & \text{if $\{v_i, v_j\} \in E$ and $i \ne j$},\\ 0 & \text{otherwise}. \end{cases}\]

Figure 3: Adjacency matrix
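To make the definition concrete, here is a minimal NumPy sketch that builds the adjacency matrix of a small undirected graph. The 4-vertex edge list is an illustrative assumption, not taken from the post:

```python
import numpy as np

# Hypothetical undirected graph on 4 vertices (example edges, chosen for illustration):
# {v0,v1}, {v0,v2}, {v1,v2}, {v2,v3}
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4

A = np.zeros((n, n), dtype=int)
for i, j in edges:
    # Undirected edge contributes to both (i, j) and (j, i),
    # so A is symmetric; the diagonal stays 0 (no self-loops).
    A[i, j] = 1
    A[j, i] = 1
```

For an undirected simple graph, the resulting matrix is symmetric with a zero diagonal, exactly as the case definition above requires.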

Degree matrix

\[D_{ii} = d(v_i), \\ D_{ij} = 0 \text{ where $i \ne j$}.\]
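Since $d(v_i)$ is just the number of neighbors of $v_i$, the degree matrix can be read directly off the row sums of the adjacency matrix. A small sketch, reusing the same illustrative 4-vertex example graph:

```python
import numpy as np

# Same hypothetical example graph as before.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = A[j, i] = 1

# d(v_i) is the i-th row sum of A; D puts the degrees on the
# diagonal and is zero everywhere else.
D = np.diag(A.sum(axis=1))
```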

Incidence matrix

\[M_{ij}=\begin{cases} 1 & \text{if $\exists k$ s.t.\ $e_j = (v_i, v_k)$},\\ -1 & \text{if $\exists k$ s.t.\ $e_j = (v_k, v_i)$},\\ 0 & \text{otherwise}. \end{cases}\]

That is, for a directed graph, column $j$ has a $1$ at the tail of edge $e_j$ and a $-1$ at its head.

Figure 4: Incidence matrix
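The sign convention above can be sketched directly: one column per edge, $+1$ at the tail and $-1$ at the head. The directed edge list here is again an illustrative assumption:

```python
import numpy as np

# Hypothetical directed edges e_j = (tail, head), for illustration only.
d_edges = [(0, 1), (0, 2), (2, 1), (2, 3)]
n, m = 4, len(d_edges)

M = np.zeros((n, m), dtype=int)
for j, (tail, head) in enumerate(d_edges):
    M[tail, j] = 1    # edge e_j leaves v_tail
    M[head, j] = -1   # edge e_j enters v_head
```

Note that every column sums to zero, since each edge contributes exactly one $+1$ and one $-1$.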

Laplacian matrix

\[L = D - A.\] \[L_{ij} = \begin{cases} d(v_i) & \text{if $i = j$},\\ -1 & \text{if $\{v_i, v_j\} \in E$ and $i \ne j$},\\ 0 & \text{otherwise}. \end{cases}\]
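Putting the two previous matrices together, $L = D - A$ is one line of NumPy. A sketch on the same illustrative example graph:

```python
import numpy as np

# Same hypothetical example graph as before.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = A[j, i] = 1
D = np.diag(A.sum(axis=1))

# Graph Laplacian: degrees on the diagonal, -1 for each edge.
L = D - A
```

A useful sanity check: every row of $L$ sums to zero, because the diagonal entry $d(v_i)$ cancels the $-1$ contributed by each of the $d(v_i)$ neighbors.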

Symmetric normalized Laplacian

\[L^{sym} = D^{-1/2}LD^{-1/2}\\ \qquad\qquad = I - D^{-1/2}AD^{-1/2}.\] \[L_{ij}^{sym} = \begin{cases} 1 & \text{if $i = j$ and $d(v_i) \ne 0$},\\ -\frac{1}{\sqrt{d(v_i)d(v_j)}} & \text{if $\{v_i, v_j\} \in E$ and $i \ne j$},\\ 0 & \text{otherwise}. \end{cases}\]
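The second form, $L^{sym} = I - D^{-1/2}AD^{-1/2}$, is the one most often computed in practice (it is the normalization used in GCN-style propagation). A minimal sketch on the same illustrative graph, assuming no isolated vertices so that $D^{-1/2}$ exists:

```python
import numpy as np

# Same hypothetical example graph as before.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

deg = A.sum(axis=1)
# Assumes every vertex has degree > 0, so 1/sqrt(d) is defined.
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))

# L_sym = I - D^{-1/2} A D^{-1/2}
L_sym = np.eye(n) - D_inv_sqrt @ A @ D_inv_sqrt
```

Matching the case definition: the diagonal is $1$, and the entry for edge $\{v_0, v_1\}$ is $-1/\sqrt{d(v_0)\,d(v_1)} = -1/\sqrt{2 \cdot 2} = -0.5$.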

Random walk normalized Laplacian

\[L^{rw} = D^{-1}L = I - D^{-1}A.\] \[L_{ij}^{rw} = \begin{cases} 1 & \text{if $i = j$ and $d(v_i) \ne 0$},\\ -\frac{1}{d(v_i)} & \text{if $\{v_i, v_j\} \in E$ and $i \ne j$},\\ 0 & \text{otherwise}. \end{cases}\]
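Unlike $L^{sym}$, the random-walk Laplacian is generally not symmetric: each row of $A$ is scaled by $1/d(v_i)$, so $D^{-1}A$ is the transition matrix of a random walk on the graph. A sketch on the same illustrative example graph, again assuming no isolated vertices:

```python
import numpy as np

# Same hypothetical example graph as before.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

deg = A.sum(axis=1)  # assumes deg > 0 everywhere
# L_rw = I - D^{-1} A: divide row i of A by d(v_i).
L_rw = np.eye(n) - A / deg[:, None]
```

Each row still sums to zero (the random walk's transition probabilities out of $v_i$ sum to one), but $L^{rw}_{ij} \ne L^{rw}_{ji}$ whenever $d(v_i) \ne d(v_j)$.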

In the next post we will explore graph neural networks (GNNs) in more detail.