Weisfeiler and Leman Go Neural: Higher-order
Graph Neural Networks
Christopher Morris, Martin Ritzert, Matthias Fey, William L. Hamilton, Jan Eric
Lenssen, Gaurav Rattan, Martin Grohe
November 12, 2018
TU Dortmund University,
RWTH Aachen University,
McGill University
Motivation
Question
How similar are two graphs?
[Figure: molecular graphs of (a) Sildenafil and (b) Vardenafil]
High-level View: Supervised Graph Classification
[Figure: input graphs are mapped into a feature space H via an embedding φ: G → H]
Talk Structure
1. State-of-the-art methods for graph classification
2. Relationship between the 1-WL kernel and Graph Neural Networks
3. Higher-order graph properties
4. Experimental results
Supervised Graph Classification: The State-of-the-Art

Kernel Methods
Find predefined substructures and count them somehow:
• Shortest paths or random walks
• Motifs
• h-neighborhoods around vertices
• Spectral approaches

Neural Methods
Parameterized neighborhood aggregation function

f_v^{(t)} = \sigma\Big(W_1 f_v^{(t-1)} + W_2 \sum_{w \in N(v)} f_w^{(t-1)}\Big),

and learn the parameters W_1 and W_2 together with the parameters of the classifier.
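To make the aggregation rule concrete, here is a minimal NumPy sketch of one such layer. The dense adjacency matrix, the ReLU nonlinearity, and all names are illustrative assumptions; this is a sketch of the general idea, not a specific published implementation.

```python
# Minimal sketch of one neighborhood-aggregation (GNN) layer:
# f_v^(t) = sigma(W1 f_v^(t-1) + W2 * sum_{w in N(v)} f_w^(t-1)).
# Dense adjacency, ReLU as sigma, and random weights are illustrative assumptions.
import numpy as np

def gnn_layer(A, F, W1, W2):
    """A: (n, n) adjacency matrix, F: (n, d_in) node features,
    W1, W2: (d_in, d_out) learnable weight matrices."""
    neighbor_sum = A @ F                                  # row v holds the sum over N(v)
    return np.maximum(F @ W1 + neighbor_sum @ W2, 0.0)    # ReLU as the nonlinearity

# Toy usage: a 4-cycle with 3-dimensional node features.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
F = np.eye(4, 3)
rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(3, 8)), rng.normal(size=(3, 8))
print(gnn_layer(A, F, W1, W2).shape)                      # (4, 8)
```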
Example: Weisfeiler-Lehman Subtree Kernel

Graph kernel based on a heuristic for graph isomorphism testing.
Iteration: Two vertices get identical colors iff their colored neighborhoods are identical.
After refinement, each graph is mapped to the vector of its color counts, e.g.

φ(G1) = (2, 2, 2, 2, 2, 2, 0, 0)
φ(G2) = (1, 1, 3, 2, 0, 1, 1, 1)

N. Shervashidze, P. Schweitzer, E. J. van Leeuwen, K. Mehlhorn, and K. M. Borgwardt. "Weisfeiler-Lehman Graph Kernels". In: JMLR 12 (2011), pp. 2539–2561.
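Below is a minimal sketch of how such count vectors can be computed by joint 1-WL color refinement over several graphs. The adjacency-dict representation, the fixed iteration count, and the function name are illustrative assumptions.

```python
# Minimal sketch of the 1-WL subtree-kernel feature map: iteratively relabel each
# vertex with (old color, sorted multiset of neighbor colors), then count colors
# per graph. Colors are refined jointly so the counts are comparable across graphs.
from collections import Counter

def wl_features(graphs, iterations=2):
    """graphs: list of adjacency dicts; returns one Counter (color -> count) per graph."""
    colors = [{v: 0 for v in adj} for adj in graphs]       # uniform initial colors
    for _ in range(iterations):
        sigs = [{v: (col[v], tuple(sorted(col[w] for w in adj[v]))) for v in adj}
                for adj, col in zip(graphs, colors)]
        palette = {s: i for i, s in enumerate(sorted({s for g in sigs for s in g.values()}))}
        colors = [{v: palette[s[v]] for v in adj} for adj, s in zip(graphs, sigs)]
    return [Counter(col.values()) for col in colors]       # phi(G) as color counts

# Toy usage: a triangle and a path on three vertices get different count vectors.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path     = {0: [1], 1: [0, 2], 2: [1]}
print(wl_features([triangle, path]))
```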
Relationship between 1-WL and GNN

1-WL coloring
c^{(t)}(v) = \mathrm{hash}\Big(c^{(t-1)}(v),\ \{\!\{ c^{(t-1)}(w) \mid w \in N(v) \}\!\}\Big)

General form of GNNs
h^{(t)}(v) = f^{W_1^{(t)}}_{\mathrm{merge}}\Big(h^{(t-1)}(v),\ f^{W_2^{(t)}}_{\mathrm{aggr}}\big(\{\!\{ h^{(t-1)}(w) \mid w \in N(v) \}\!\}\big)\Big)

Both methods aggregate the colors/features of neighbors.

Theorem (Informal)
GNNs cannot be more expressive than 1-WL in terms of distinguishing non-isomorphic graphs.
Insight
GNNs are as powerful as 1-WL if f^{W_1^{(t)}}_{\mathrm{merge}} and f^{W_2^{(t)}}_{\mathrm{aggr}} are injective (a sketch of one such construction follows below).

Theorem (Informal)
There exists a GNN architecture and corresponding weights such that it reaches a coloring equivalent to that of 1-WL.
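One well-known way to obtain injective-style merge and aggregate functions in practice is sum aggregation over the neighbor multiset followed by a multilayer perceptron, in the spirit of GIN-style architectures. The sketch below illustrates that idea; it is not the exact architecture of this talk, and the two-layer MLP, random weights, and all names are illustrative assumptions.

```python
# Sketch of an injective-style update: sum-aggregate neighbor features (sums of
# distinct multisets of one-hot-like vectors stay distinct), then apply an MLP to
# the concatenation of the vertex's own feature and the aggregate.
import numpy as np

def mlp(x, Ws):
    for W in Ws[:-1]:
        x = np.maximum(x @ W, 0.0)    # ReLU hidden layers
    return x @ Ws[-1]

def injective_style_layer(A, F, Ws):
    aggregated = A @ F                                 # sum over the neighbor multiset
    merged = np.concatenate([F, aggregated], axis=1)   # merge own feature with aggregate
    return mlp(merged, Ws)

# Toy usage: a path v1 - v0 - v2 with one-hot node features.
rng = np.random.default_rng(1)
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
F = np.eye(3)
Ws = [rng.normal(size=(6, 16)), rng.normal(size=(16, 8))]
print(injective_style_layer(A, F, Ws).shape)           # (3, 8)
```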
[Figure: 1-WL and a suitable GNN compute equivalent colorings]

Take Away
GNNs have the same power as 1-WL in distinguishing non-isomorphic graphs. The limits of 1-WL are well understood.

V. Arvind, J. Köbler, G. Rattan, and O. Verbitsky. "On the Power of Color Refinement". In: Symposium on Fundamentals of Computation Theory. 2015, pp. 339–350.
Limits of GNNs

Observation
GNNs cannot distinguish graphs that differ in very basic graph properties, e.g.,
• Cycle-free vs. cyclic graphs
• Triangle counts
• Regular graphs

Observation
Higher-order graph properties play an important role in the characterization of real-world networks.
Higher-order Graph Properties

Challenge
Incorporate more higher-order graph properties into Graph Neural Networks.

[Figure: just as 1-WL corresponds to GNNs, k-WL corresponds to k-GNNs, moving toward a more global view]

Idea: k-WL
Color subgraphs instead of vertices, and define neighborhoods between them.
k-dimensional Weisfeiler-Lehman
• Colors vertex tuples from V^k
• Two tuples v, w are i-neighbors if v_j = w_j for all j ≠ i

[Figure: example graph on the vertices v_1, ..., v_6]

Idea of the Algorithm
Initially: Two tuples get the same color if their induced subgraphs are isomorphic.
Iteration: Two tuples get the same color iff they have the same colored neighborhood.
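The following is a minimal sketch of k-WL refinement on vertex tuples, assuming a small graph given by an edge list; the initial coloring by ordered isomorphism type, the iteration count, and all names are illustrative assumptions. To compare two graphs, one would refine them jointly (e.g., over their disjoint union) and compare the resulting color histograms.

```python
# Minimal sketch of k-WL color refinement on vertex k-tuples.
from itertools import product

def k_wl(n, edges, k, iterations=3):
    """n: number of vertices 0..n-1, edges: iterable of pairs, k: tuple dimension."""
    adj = {(u, v) for u, v in edges} | {(v, u) for u, v in edges}
    tuples = list(product(range(n), repeat=k))

    # Initial coloring: ordered isomorphism type of the tuple, i.e. which positions
    # coincide and which pairs of positions are adjacent in the graph.
    def iso_type(t):
        eq = tuple(t[i] == t[j] for i in range(k) for j in range(k))
        ed = tuple((t[i], t[j]) in adj for i in range(k) for j in range(k))
        return (eq, ed)

    colors = {t: iso_type(t) for t in tuples}
    for _ in range(iterations):
        refined = {}
        for t in tuples:
            # i-neighbors of t: replace position i by every vertex w (sorted = multiset).
            nbhd = tuple(
                tuple(sorted(colors[t[:i] + (w,) + t[i + 1:]] for w in range(n)))
                for i in range(k))
            refined[t] = (colors[t], nbhd)
        # Compress colors to small integers after each round.
        palette = {c: i for i, c in enumerate(sorted(set(refined.values()), key=repr))}
        colors = {t: palette[refined[t]] for t in tuples}
    return colors

# Toy usage (2-WL on a 4-cycle): number of distinct tuple colors after refinement.
colors = k_wl(4, [(0, 1), (1, 2), (2, 3), (3, 0)], k=2)
print(len(set(colors.values())))
```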
k-GNN

Idea
Derive a k-dimensional Graph Neural Network:

f_S^{(t)} = \sigma\Big(W_1 f_S^{(t-1)} + W_2 \sum_{T \in N(S)} f_T^{(t-1)}\Big),

where S is a subgraph of the input graph of size k.

Challenges
• Scalability
• GPU memory consumption
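Here is a minimal NumPy sketch of one such layer over k-element vertex subsets, assuming the set-based neighborhood in which two subsets are neighbors iff they share k−1 vertices; graph structure would enter through the initial subgraph features, and all names are illustrative assumptions rather than the authors' exact implementation.

```python
# Minimal sketch of one k-GNN layer over k-element vertex subsets, following
# f_S^(t) = sigma(W1 f_S^(t-1) + W2 * sum over neighbors T of f_T^(t-1)).
import numpy as np
from itertools import combinations

def k_gnn_layer(n, k, features, W1, W2):
    """features: (number of k-subsets, d_in), W1, W2: (d_in, d_out)."""
    subsets = list(combinations(range(n), k))
    index = {S: i for i, S in enumerate(subsets)}
    out = np.zeros((len(subsets), W1.shape[1]))
    for S in subsets:
        agg = np.zeros(features.shape[1])
        # Neighbors of S: subsets T that differ from S in exactly one vertex.
        for v in S:
            for w in range(n):
                if w not in S:
                    T = tuple(sorted((set(S) - {v}) | {w}))
                    agg += features[index[T]]
        h = features[index[S]] @ W1 + agg @ W2
        out[index[S]] = np.maximum(h, 0.0)   # ReLU as the nonlinearity sigma
    return out

# Toy usage: 5 vertices, k = 2, so 10 two-element subsets.
rng = np.random.default_rng(2)
n, k, d_in, d_out = 5, 2, 4, 8
F = rng.normal(size=(len(list(combinations(range(n), k))), d_in))
W1, W2 = rng.normal(size=(d_in, d_out)), rng.normal(size=(d_in, d_out))
print(k_gnn_layer(n, k, F, W1, W2).shape)    # (10, 8)
```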
Hierarchical k-GNN

Idea
Learn features for subgraphs in a hierarchical way.

[Figure: pipeline 1-GNN → Pool → 2-GNN → Pool → 3-GNN → Pool → MLP, learning higher-order graph properties]
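A minimal sketch of the hierarchical idea, assuming sum pooling: features learned at the node level initialize the subset level, and a final readout over the levels feeds the MLP classifier. The pooling choice and all names are illustrative assumptions, not the exact architecture of the talk.

```python
# Sketch of the hierarchical pipeline: pool learned node (1-GNN) features into
# initial features for each k-subset, run k-GNN layers on those, then pool
# everything into one graph-level vector for a downstream MLP classifier.
import numpy as np
from itertools import combinations

def lift_to_subsets(node_features, n, k):
    """Initialize each k-subset's feature as the sum of its vertices' learned features."""
    subsets = list(combinations(range(n), k))
    return subsets, np.stack([node_features[list(S)].sum(axis=0) for S in subsets])

def graph_readout(node_features, subset_features):
    """Concatenate pooled features from both levels into one graph embedding."""
    return np.concatenate([node_features.sum(axis=0), subset_features.sum(axis=0)])

# Toy usage: 5 nodes with 8-dimensional 1-GNN outputs, lifted to 2-subsets.
rng = np.random.default_rng(3)
node_features = rng.normal(size=(5, 8))          # e.g. output of the 1-GNN stack
subsets, F2 = lift_to_subsets(node_features, n=5, k=2)
print(graph_readout(node_features, F2).shape)    # (16,) -> input to the final MLP
```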
Experimental Results

[Figure 3: Classification accuracy [%] of 1-GNN vs. 1-k-GNN on PRO, IMDB-M, NCI1, PTC-FM, and MUTAG: improvement on the smaller benchmark datasets]
Experimental Results

[Figure 4: Regression on the QM9 data set for the targets U0, ZPVE, and H; error normalized to the 1-GNN baseline (lower is better), comparing 1-k-GNN, 1-GNN, MPNN, and DTNN: gain over the 1-GNN baseline]
Conclusion
1. Relationship between the 1-WL kernel and Graph Neural Networks
• GNNs are a differentiable version of 1-WL
• GNNs and 1-WL are equally powerful
2. Higher-order graph embeddings
• k-dimensional GNN
• Hierarchical variant
3. Experimental results
• Good results for large datasets with continuous node labels

Collection of graph classification benchmarks:
graphkernels.cs.tu-dortmund.de