Towards Efficient Temporal Graph Learning: Algorithms, Frameworks, and Tools

Oct 21, 2024 · Dachun Sun (孙达春) · 1 min read
Abstract
Temporal graphs capture dynamic node relations via temporal edges and find extensive utility in a wide range of domains where time-varying patterns are crucial. Temporal Graph Neural Networks (TGNNs) have gained significant attention for their effectiveness in representing temporal graphs. However, TGNNs still face notable efficiency challenges in real-world low-resource settings. First, from a data-efficiency standpoint, training TGNNs requires sufficient temporal edges and data labels, which is problematic in practical scenarios with limited data collection and annotation. Second, from a resource-efficiency perspective, TGNN training and inference are computationally demanding due to complex encoding operations, especially on large-scale temporal graphs. Minimizing resource consumption while preserving effectiveness is essential. Motivated by these efficiency challenges, this tutorial systematically introduces state-of-the-art data-efficient and resource-efficient TGNNs, focusing on algorithms, frameworks, and tools, and discusses promising yet under-explored research directions in efficient temporal graph learning. This tutorial aims to benefit researchers and practitioners in data mining, machine learning, and artificial intelligence.
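
To make the opening notion concrete, here is a minimal sketch (not from the tutorial materials) of how a temporal graph can be represented as a list of timestamped edges, together with the chronological neighbor lookup that TGNN encoders typically build on. All names and the toy data below are illustrative assumptions.

```python
# Minimal sketch: a temporal graph as (source, destination, timestamp) triples,
# plus a helper that returns a node's neighbors observed before a query time.
from collections import defaultdict

# Toy temporal edge list (illustrative only)
temporal_edges = [
    (0, 1, 10.0),
    (1, 2, 12.5),
    (0, 2, 15.0),
    (2, 3, 20.0),
]

# Index edges by source node for quick temporal-neighbor lookups
adjacency = defaultdict(list)
for src, dst, t in temporal_edges:
    adjacency[src].append((dst, t))

def neighbors_before(node, t_query):
    """Return (neighbor, timestamp) pairs for edges from `node` before `t_query`."""
    return [(dst, t) for dst, t in adjacency[node] if t < t_query]

print(neighbors_before(0, 14.0))  # [(1, 10.0)]
```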
Date
Oct 21, 2024 1:45 PM — 5:30 PM
Event
Location
120C, Boise Centre
Boise, Idaho

Authors
Dachun Sun (孙达春)
Ph.D. Candidate