Minisymposium Presentation

3S in Distributed Graph Neural Networks: Sparse Communication, Sampling, and Scalability

Tuesday, June 4, 2024
17:30 - 18:00 CEST

Description

This talk will focus on distributed-memory parallel algorithms for graph neural network (GNN) training. We will first show how sparse matrix primitives can be used to parallelize mini-batch training based on node-wise and layer-wise sampling. We will then present techniques built on sparsity-aware sparse-times-dense matrix multiplication (SpMM) algorithms that accelerate both full-graph and mini-batch sampling-based training.
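As a rough illustration of these ideas (a minimal single-process sketch in SciPy, not the speaker's implementation; all function names and the toy setup are assumptions), the snippet below expresses one GNN aggregation layer as a sparse-times-dense matrix product, builds a node-wise sampled mini-batch as a sparse block of the adjacency matrix, and computes the sparsity-aware communication plan a rank would need under a hypothetical 1D row partition: only the feature rows matching nonzero columns of its local block.

import numpy as np
import scipy.sparse as sp

def gnn_layer(adj, feats, weight):
    # One GNN aggregation layer expressed as SpMM: ReLU(A @ H @ W).
    return np.maximum(adj @ feats @ weight, 0.0)

def sample_nodewise(adj, batch, fanout, rng):
    # Node-wise sampling: keep at most `fanout` neighbors per batch node,
    # yielding a sparse sampled block of the adjacency matrix.
    rows, cols = [], []
    for i, v in enumerate(batch):
        neigh = adj.indices[adj.indptr[v]:adj.indptr[v + 1]]
        if len(neigh) > fanout:
            neigh = rng.choice(neigh, fanout, replace=False)
        rows.extend([i] * len(neigh))
        cols.extend(neigh)
    data = np.ones(len(rows))
    return sp.csr_matrix((data, (rows, cols)), shape=(len(batch), adj.shape[0]))

def remote_rows_needed(local_adj, owned_rows):
    # Sparsity-aware communication plan (illustrative): under a 1D row
    # partition of A, a rank only needs the feature rows matching nonzero
    # columns of its local block, not a replica of the full feature matrix.
    needed = np.unique(local_adj.indices)
    return np.setdiff1d(needed, owned_rows)

# Toy usage: sparse random graph on 100 nodes, 16-dim features.
rng = np.random.default_rng(0)
adj = sp.random(100, 100, density=0.05, format="csr", random_state=0)
feats = rng.standard_normal((100, 16))
weight = rng.standard_normal((16, 8))

batch = rng.choice(100, size=10, replace=False)
sampled = sample_nodewise(adj, batch, fanout=5, rng=rng)
out = gnn_layer(sampled, feats, weight)  # (10, 8) embeddings for the batch
print(out.shape, remote_rows_needed(sampled, batch).size)

In the distributed setting the talk targets, the adjacency and feature matrices would be partitioned across ranks, and index sets like the one returned by remote_rows_needed would drive the sparse communication step rather than being computed locally as above.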

Authors