arxiv:2501.05441

The GAN is dead; long live the GAN! A Modern GAN Baseline

Published on Jan 9 · Submitted by Skylion007 on Jan 10
#1 Paper of the day

Abstract

There is a widely-spread claim that GANs are difficult to train, and GAN architectures in the literature are littered with empirical tricks. We provide evidence against this claim and build a modern GAN baseline in a more principled manner. First, we derive a well-behaved regularized relativistic GAN loss that addresses issues of mode dropping and non-convergence that were previously tackled via a bag of ad-hoc tricks. We analyze our loss mathematically and prove that it admits local convergence guarantees, unlike most existing relativistic losses. Second, our new loss allows us to discard all ad-hoc tricks and replace outdated backbones used in common GANs with modern architectures. Using StyleGAN2 as an example, we present a roadmap of simplification and modernization that results in a new minimalist baseline -- R3GAN. Despite being simple, our approach surpasses StyleGAN2 on FFHQ, ImageNet, CIFAR, and Stacked MNIST datasets, and compares favorably against state-of-the-art GANs and diffusion models.
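To make the loss concrete, here is a minimal PyTorch sketch of a regularized relativistic objective of the kind the abstract describes: a relativistic pairing term (softplus of the difference between critic scores on fake and real samples) plus zero-centered gradient penalties R1 (on real data) and R2 (on generated data). The names `D`, `G`, `real`, `z`, and the weight `gamma` are illustrative placeholders, not the paper's exact implementation.

```python
import torch
import torch.nn.functional as F

def zero_centered_gradient_penalty(scores, inputs):
    # E[ ||grad_x D(x)||^2 ]: R1 when x is real data, R2 when x is generated.
    grad, = torch.autograd.grad(scores.sum(), inputs, create_graph=True)
    return grad.square().flatten(1).sum(dim=1).mean()

def discriminator_loss(D, G, real, z, gamma=1.0):
    real = real.detach().requires_grad_(True)
    fake = G(z).detach().requires_grad_(True)
    d_real, d_fake = D(real), D(fake)
    # Relativistic pairing: D only judges real samples relative to fakes,
    # so there is no fixed decision boundary for the generator to collapse onto.
    adv = F.softplus(d_fake - d_real).mean()
    r1 = zero_centered_gradient_penalty(d_real, real)
    r2 = zero_centered_gradient_penalty(d_fake, fake)
    return adv + (gamma / 2) * (r1 + r2)

def generator_loss(D, G, real, z):
    # Symmetric relativistic objective for the generator.
    return F.softplus(D(real) - D(G(z))).mean()
```

Regularizing the critic on both real and generated samples, rather than with R1 alone as in StyleGAN2, is what the authors credit for the local convergence guarantee.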

Community

Paper author · Paper submitter

Can GANs beat diffusion in 2025? Yes!

This paper describes how to train GAN models more stably than prior work. The authors modernize the ConvNet architecture of GANs while maintaining the performance and equilibrium of the model. This newer, more stable training strategy lets the GAN train for longer, matching a similar number of training steps to diffusion models. Once GANs are trained long enough, with a powerful enough architecture, they can outperform diffusion models and serve as better, faster, smaller models.

We created a deep-dive video for this paper: https://www.youtube.com/watch?v=tMJx2s8YsZs. Happy learning 🤓 and stretching 💪 together! ✨


Models citing this paper 5


Datasets citing this paper 0


Spaces citing this paper 1

Collections including this paper 5