
Running Flux.1-dev in under 12 GB of VRAM

This repository contains the NF4-quantized parameters for the T5 text encoder and the transformer of Flux.1-Dev. Check out this Colab Notebook for details on how they were obtained.

Check out this notebook to see how to load the checkpoints and run inference in a free-tier Colab.

Respective diffusers PR: https://github.com/huggingface/diffusers/pull/9213/.

The checkpoints in this repository were optimized to run on a T4 GPU (the free Colab tier). Specifically, the compute dtype of the quantized checkpoints was kept at FP16, since the T4 does not support BF16. If your GPU does support BF16, you should switch the compute dtype to BF16 via `bnb_4bit_compute_dtype`.
