---
license: mit
pipeline_tag: unconditional-image-generation
---

# FreeFlow: Flow Map Distillation Without Data

This repository contains the official PyTorch implementation for the paper:
[**Flow Map Distillation Without Data**](https://huggingface.co/papers/2511.19428)

**[Project Page](https://data-free-flow-distill.github.io/)** | **[GitHub Repository](https://github.com/ShangyuanTong/FreeFlow)**

State-of-the-art flow models achieve remarkable quality but require slow, iterative sampling. FreeFlow explores a data-free alternative to flow map distillation, which conventionally requires an external dataset. By sampling only from the prior distribution, our method circumvents the risk of Teacher-Data Mismatch. It learns to predict the teacher's sampling path while actively correcting for its own compounding errors, achieving state-of-the-art performance: an FID of **1.45** on ImageNet 256x256 and **1.49** on ImageNet 512x512, both with a single sampling step. We hope this work establishes a more robust paradigm for accelerating generative models.

![FreeFlow samples](https://github.com/ShangyuanTong/FreeFlow/raw/main/assets/visual_teaser.jpeg)
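
Conceptually, once the flow map is distilled, generation collapses to a single network evaluation on prior noise. The sketch below is purely illustrative; `flow_map`, its call signature, and the latent shape are hypothetical stand-ins, not the repository's actual interface.

```python
import torch

# Illustrative sketch of data-free one-step generation, NOT the
# repository's actual API: `flow_map` stands in for the distilled
# student, and the latent shape is a placeholder.
@torch.no_grad()
def sample_one_step(flow_map: torch.nn.Module, batch_size: int,
                    shape: tuple = (4, 32, 32)) -> torch.Tensor:
    # Draw from the prior only -- no external dataset is touched.
    x0 = torch.randn(batch_size, *shape)
    # A single forward pass transports noise to a sample, replacing
    # the teacher's slow, iterative ODE solve.
    return flow_map(x0)
```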

## Usage

### Setup

We provide an [`environment.yml`](https://github.com/ShangyuanTong/FreeFlow/blob/main/environment.yml) file that can be used to create a Conda environment. If you only want to run pre-trained models locally on CPU, you can remove the `cudatoolkit` and `pytorch-cuda` requirements from the file.

```bash
conda env create -f environment.yml
conda activate DiT
```

### Sampling

Pre-trained FreeFlow checkpoints are hosted on the [Hugging Face organization page](https://huggingface.co/nyu-visionx/FreeFlow/tree/main). You can sample from our pre-trained models with [`sample.py`](https://github.com/ShangyuanTong/FreeFlow/blob/main/sample.py). To use them, download a checkpoint (see the Hugging Face download [guide](https://huggingface.co/docs/huggingface_hub/en/guides/download)) and pass its local file path to the script, as shown below.
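
For instance, a checkpoint can be fetched programmatically with `huggingface_hub`; this is a sketch, and the filename below is a placeholder, so substitute the actual file name from the repository listing:

```python
from huggingface_hub import hf_hub_download

# Fetch a FreeFlow checkpoint from the Hub. The filename is a
# placeholder; check the repository listing for the real name.
ckpt_path = hf_hub_download(
    repo_id="nyu-visionx/FreeFlow",
    filename="<checkpoint-file>.pt",
)
print(ckpt_path)  # pass this local path to sample.py via --ckpt
```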

The script lets you switch between the 256x256 and 512x512 models and adjust the classifier-free guidance scale, among other options. For example, to sample from our 512x512 FreeFlow-XL/2 model, you can use:

```bash
python sample.py --image-size 512 --seed 1 --ckpt <ckpt-path>
```

## Citation

If you find our work helpful or inspiring, please feel free to cite it:

```bibtex
@article{tong2025freeflow,
  title={Flow Map Distillation Without Data},
  author={Tong, Shangyuan and Ma, Nanye and Xie, Saining and Jaakkola, Tommi},
  year={2025},
  journal={arXiv preprint arXiv:2511.19428},
}
```