Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees Paper • 2110.03313 • Published Oct 7, 2021
SWARM Parallelism: Training Large Models Can Be Surprisingly Communication-Efficient Paper • 2301.11913 • Published Jan 27, 2023
A critical look at the evaluation of GNNs under heterophily: Are we really making progress? Paper • 2302.11640 • Published Feb 22, 2023
Distributed Inference and Fine-tuning of Large Language Models Over The Internet Paper • 2312.08361 • Published Dec 13, 2023