The field of temporal graph neural networks gains a new contender in 'Todyformer: Towards Holistic Dynamic Graph Transformers with Structure-Aware Tokenization'. Todyformer aims to tackle two well-known limitations of existing message-passing architectures, over-squashing and over-smoothing, by introducing a transformer-based neural network for dynamic graphs built around structure-aware tokenization.
Todyformer represents a significant step forward for dynamic graph modeling, combining the local encoding strengths of MPNNs with the global encoding prowess of Transformers to capture long-range temporal dependencies. The reported results affirm the benefits of this intertwined local-global approach, enhancing performance on downstream tasks.
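To make the intertwined local-global idea concrete, here is a minimal NumPy sketch of alternating a message-passing (local) step with a self-attention (global) step over node features. This is an illustration of the general pattern, not the authors' implementation: the function names, mean-neighbor aggregation, single-head attention, and residual connections are all assumptions made for clarity.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_mpnn(X, A):
    # Local encoding: one message-passing step that mean-aggregates
    # neighbor features (self-loops added so each node keeps its own signal).
    A_hat = A + np.eye(A.shape[0])
    deg = A_hat.sum(axis=1, keepdims=True)
    return (A_hat @ X) / deg

def global_attention(X, Wq, Wk, Wv):
    # Global encoding: single-head self-attention over all node tokens,
    # letting distant nodes exchange information in one step.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = softmax(Q @ K.T / np.sqrt(K.shape[1]))
    return scores @ V

def intertwined_block(X, A, Wq, Wk, Wv):
    # Alternate local (MPNN) and global (Transformer) encoding with residuals.
    X = X + local_mpnn(X, A)
    X = X + global_attention(X, Wq, Wk, Wv)
    return X

rng = np.random.default_rng(0)
n, d = 5, 4                                  # toy graph: 5 nodes, 4-dim features
X = rng.standard_normal((n, d))
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.maximum(A, A.T)                       # symmetric adjacency
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))

out = X
for _ in range(2):                           # stack two intertwined blocks
    out = intertwined_block(out, A, Wq, Wk, Wv)
print(out.shape)
```

Stacking such blocks is what lets the model sidestep over-squashing: information that would need many message-passing hops to travel across the graph can instead flow through the global attention step.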