lllyasviel/flux1-dev-bnb-nf4 with Python

#26 opened by elenopes

I'm trying to use
MODEL: lllyasviel/flux1-dev-bnb-nf4 from https://huggingface.co/lllyasviel/flux1-dev-bnb-nf4
together with all of its components:
VAE: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/ae.safetensors
ENCODER: https://huggingface.co/comfyanonymous/flux_text_encoders/blob/main/t5xxl_fp8_e4m3fn.safetensors
CLIP: https://huggingface.co/comfyanonymous/flux_text_encoders/blob/main/clip_l.safetensors
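
For reference, this is roughly how I fetched the files (a minimal sketch with huggingface_hub's hf_hub_download; the local folders are just my own choice, and I renamed/moved the text encoder files into the layout below afterwards):

# Sketch: download the checkpoint and the extra components with huggingface_hub.
# The target folders are my own layout, nothing required by diffusers.
from huggingface_hub import hf_hub_download

hf_hub_download(repo_id="lllyasviel/flux1-dev-bnb-nf4",
                filename="flux1-dev-bnb-nf4.safetensors", local_dir="flux")
hf_hub_download(repo_id="black-forest-labs/FLUX.1-dev",
                filename="ae.safetensors", local_dir="flux")
hf_hub_download(repo_id="comfyanonymous/flux_text_encoders",
                filename="t5xxl_fp8_e4m3fn.safetensors", local_dir="flux/t5xxl_fp8")
hf_hub_download(repo_id="comfyanonymous/flux_text_encoders",
                filename="clip_l.safetensors", local_dir="flux/clip_l")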

My repo now:

flux/
├── ae.safetensors
├── flux1-dev-bnb-nf4.safetensors
├── model_index.json        // made by me
├── tokenizer/
│   └── tokenizer.json      // from https://huggingface.co/black-forest-labs/FLUX.1-dev/tree/main/tokenizer_2
├── t5xxl_fp8/
│   ├── config.json         // from https://huggingface.co/black-forest-labs/FLUX.1-dev/tree/main/transformer
│   └── model.safetensors
├── vae/
│   ├── config.json         // from https://huggingface.co/black-forest-labs/FLUX.1-dev/tree/main/vae
│   └── diffusion_pytorch_model.safetensors
└── clip_l/
    ├── config.json
    └── model.safetensors
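
To double-check that everything is where I think it is, I run a small sanity check like this (just my own helper, nothing diffusers requires):

# Sanity check: confirm every file from the layout above actually exists.
import os

expected = [
    "flux/ae.safetensors",
    "flux/flux1-dev-bnb-nf4.safetensors",
    "flux/model_index.json",
    "flux/tokenizer/tokenizer.json",
    "flux/t5xxl_fp8/config.json",
    "flux/t5xxl_fp8/model.safetensors",
    "flux/vae/config.json",
    "flux/vae/diffusion_pytorch_model.safetensors",
    "flux/clip_l/config.json",
    "flux/clip_l/model.safetensors",
]
for path in expected:
    print(path, os.path.exists(path))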

My code:

import os
import torch
from diffusers import AutoencoderKL, FluxTransformer2DModel, FluxPipeline
from transformers import CLIPTextModel, PreTrainedTokenizerFast, T5ForConditionalGeneration, logging

#print(os.path.exists(diffusion_model_path))  # should print True if the file exists
#logging.set_verbosity(logging.DEBUG)
torch.cuda.empty_cache()
torch.device('cuda')

# Components
print('vae')
vae = AutoencoderKL.from_pretrained("./flux/vae")

print('clip_l')
text_encoder = CLIPTextModel.from_pretrained("./flux/clip_l")

print('tokenizer')
tokenizer = PreTrainedTokenizerFast.from_pretrained("./flux/tokenizer")

print('t5xxl_fp8')
t5_model = T5ForConditionalGeneration.from_pretrained("./flux/t5xxl_fp8")

print('flux1-dev')
transformer = FluxTransformer2DModel.from_pretrained("./flux/flux1-dev-bnb-nf4.safetensors")

# Model
print('Create the pipeline with the loaded models...')
model = FluxPipeline(vae=vae, text_encoder=text_encoder, tokenizer=tokenizer, transformer=transformer, t5_model=t5_model)

The error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[11], line 22
     19 tokenizer = PreTrainedTokenizerFast.from_pretrained("./flux/tokenizer")
     21 print('t5xxl_fp8')
---> 22 t5_model = T5ForConditionalGeneration.from_pretrained("./flux/t5xxl_fp8")
     24 print('flux1-dev')
     25 transformer = FluxTransformer2DModel.from_pretrained("./flux/flux1-dev-bnb-nf4.safetensors")

File ~/.local/lib/python3.12/site-packages/transformers/modeling_utils.py:3792, in PreTrainedModel.from_pretrained(cls, pretrained_model_name_or_path, config, cache_dir, ignore_mismatched_sizes, force_download, local_files_only, token, revision, use_safetensors, *model_args, **kwargs)
   3789 with safe_open(resolved_archive_file, framework="pt") as f:
   3790     metadata = f.metadata()
-> 3792 if metadata.get("format") == "pt":
   3793     pass
   3794 elif metadata.get("format") == "tf":

AttributeError: 'NoneType' object has no attribute 'get'
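
From the traceback it looks like f.metadata() returns None for flux/t5xxl_fp8/model.safetensors, i.e. the safetensors header has no "format": "pt" entry, which is what from_pretrained checks. I looked at it with this small diagnostic (a sketch using safetensors.safe_open):

# Diagnostic sketch: print the header metadata that transformers checks
# in from_pretrained. If this prints None, the file has no "format" entry
# and from_pretrained fails with exactly this AttributeError.
from safetensors import safe_open

with safe_open("flux/t5xxl_fp8/model.safetensors", framework="pt") as f:
    print(f.metadata())   # prints None for my file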

Maybe somebody knows what my mistake is?
P.S. I'm really bad with config files.

Thanks!!!!

I'm not that experienced yet, but I really want to keep working with models and learn more.
