Week 13: Graph Neural Networks Flashcards
(6 cards)
Briefly describe what a GNN is.
A: A GNN is an optimizable transformation on all attributes of the graph (nodes, edges, global-context) that preserves graph symmetries (permutation invariances).
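To make the permutation-invariance point concrete, here is a minimal, hypothetical sketch (plain NumPy, illustrative names only): a sum readout over node features gives the same graph-level result regardless of how the nodes happen to be ordered.

```python
import numpy as np

# Toy illustration: a sum readout over node features is permutation
# invariant -- reordering the nodes does not change the graph-level output.
node_features = np.array([[1.0, 2.0],
                          [3.0, 4.0],
                          [5.0, 6.0]])   # 3 nodes, 2 features each

permutation = np.array([2, 0, 1])        # an arbitrary reordering of the nodes
readout_original = node_features.sum(axis=0)
readout_permuted = node_features[permutation].sum(axis=0)

assert np.allclose(readout_original, readout_permuted)
```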
What are the three main types of prediction tasks in GNNs?
A: Graph-level: Predict properties of the entire graph (e.g., “Does the molecule smell?”)
Node-level: Predict properties of individual nodes (e.g., Karate club allegiances)
Edge-level: Predict properties of edges or edge existence (e.g., relationship classification)
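A rough sketch of how the three task types differ in what they read out from final node embeddings; all names (Z, w_graph, w_node, w_edge) and the random values are illustrative assumptions, not any particular library's API.

```python
import numpy as np

rng = np.random.default_rng(2)
Z = rng.normal(size=(5, 16))              # final embeddings for 5 nodes (hypothetical)

w_graph = rng.normal(size=16)             # stand-ins for learned readout weights
w_node = rng.normal(size=16)
w_edge = rng.normal(size=16)

graph_score = Z.mean(axis=0) @ w_graph    # graph-level: pool all nodes, one prediction
node_scores = Z @ w_node                  # node-level: one prediction per node
edge_score = (Z[0] * Z[3]) @ w_edge       # edge-level: score for a candidate edge (0, 3)
```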
What are the three geometric priors of the GDL blueprint?
A:
Symmetry (Invariance/Equivariance): Output should be consistent under transformations
Geometric Stability: Stability to signal and domain deformations
Scale Separation: Multi-scale representation capability
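A toy check of the symmetry prior, assuming a layer built from a neighborhood sum followed by a shared linear map (a simple illustrative construction, not a specific model): permuting the input nodes permutes the output in the same way, i.e., the layer is permutation equivariant.

```python
import numpy as np

rng = np.random.default_rng(3)

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)    # adjacency matrix of a 3-node path graph
X = rng.normal(size=(3, 4))               # node features
W = rng.normal(size=(4, 4))               # shared (stand-in) weight matrix

def layer(A, X, W):
    # neighborhood sum followed by a shared linear map: equivariant by construction
    return (A @ X) @ W

P = np.eye(3)[[2, 0, 1]]                  # a permutation matrix
out_then_permute = P @ layer(A, X, W)
permute_then_out = layer(P @ A @ P.T, P @ X, W)
assert np.allclose(out_then_permute, permute_then_out)
```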
Explain the mechanism of message passing in graph neural networks.
A: Message passing works in three steps (a minimal sketch follows the list):
Gather: Collect neighboring node embeddings (messages) for each node
Aggregate: Combine the messages using an aggregation function (sum, mean, max)
Update: Pass the aggregated message through a learned neural network to update the node representation
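A minimal NumPy sketch of one round of message passing on a hypothetical 4-node graph; the weight matrices stand in for learned parameters and every name here is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

adjacency = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}   # neighbor lists
node_feats = rng.normal(size=(4, 8))                 # 4 nodes, 8-dim features

W_self = rng.normal(size=(8, 8))                     # stand-ins for learned weights
W_neigh = rng.normal(size=(8, 8))

updated = np.zeros_like(node_feats)
for v, neighbors in adjacency.items():
    # 1. Gather: collect neighboring node embeddings (the "messages").
    messages = node_feats[neighbors]
    # 2. Aggregate: combine messages with a permutation-invariant function (mean here).
    aggregated = messages.mean(axis=0)
    # 3. Update: pass the node and its aggregated message through a learned transformation.
    updated[v] = np.tanh(node_feats[v] @ W_self + aggregated @ W_neigh)
```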
What are the two main approaches to graph convolution?
A:
Spectral-based:
Filter signals using the eigendecomposition of the graph Laplacian
Transform to spectral domain, filter, then inverse transform
Removes noise from graph signals
Spatial-based:
Filter by information propagation distance
Aggregate based on neighborhood structure
Direct aggregation of neighboring features
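For the spatial side, a minimal sketch in the spirit of the widely used GCN propagation rule H' = σ(D^{-1/2}(A + I)D^{-1/2} H W); the adjacency, features, and random weights below are illustrative assumptions, not a specific library API.

```python
import numpy as np

rng = np.random.default_rng(1)

A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # adjacency matrix
H = rng.normal(size=(4, 8))                 # node features
W = rng.normal(size=(8, 4))                 # learned weight matrix (random stand-in)

A_hat = A + np.eye(len(A))                                  # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))      # symmetric normalization
H_next = np.maximum(0.0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)  # ReLU activation
```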
What made AlphaFold2 revolutionary?
A:
Solved the 50-year-old protein folding challenge
Achieved breakthrough performance at the CASP14 competition
Created a comprehensive protein structure database (AlphaFold DB)
Its developers won a share of the 2024 Nobel Prize in Chemistry
Enables drug discovery and a deeper understanding of life processes