Custom models failing with tokenizer error

#87
by lloydchristmas1231

I have been using several custom models by modifying the app.py file: adding new models to swap_base_model and wiring up the associated controls (adding an entry to the base_model_to_use dropdown and pointing to my repo id inside the if(is_gpu_associated): block). This worked for a few days. However, when I tried the same steps for a new model, I started receiving the following error:
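For reference, the edit described above amounts to something like the sketch below. Everything here is an assumption reconstructed from the identifiers mentioned in this post (swap_base_model, base_model_to_use, is_gpu_associated); the standard model repo ids and the resolver function are illustrative, not the Space's actual code:

```python
# Hypothetical sketch of the app.py change described in this post.
# Only the identifier names come from the post; the dropdown values,
# repo ids, and resolver function are assumptions for illustration.

STANDARD_MODELS = {
    # the Space's built-in base models (repo ids assumed)
    "v1-5": "runwayml/stable-diffusion-v1-5",
    "v2-base": "stabilityai/stable-diffusion-2-base",
    "v2-1": "stabilityai/stable-diffusion-2-1",
}

def resolve_base_model(base_model_to_use, custom_repo_id=None):
    """Map the dropdown choice to a Hub repo id, falling back to a custom repo."""
    if base_model_to_use in STANDARD_MODELS:
        return STANDARD_MODELS[base_model_to_use]
    if custom_repo_id:
        # the custom branch added inside the if(is_gpu_associated): block,
        # e.g. "lloydchristmas1231/mymodel"
        return custom_repo_id
    raise ValueError(f"unknown base model choice: {base_model_to_use!r}")
```

The point of the sketch is that the custom branch only changes *which* repo id gets loaded; the loader still expects that repo to have the same on-disk layout as the standard models.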

Can't load tokenizer for '/home/user/.cache/huggingface/hub/models--lloydchristmas1231--mymodel/snapshots/376a82eb10d63bfd939865bb5f681a66cba505e1'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure '/home/user/.cache/huggingface/hub/models--lloydchristmas1231--mymodel/snapshots/376a82eb10d63bfd939865bb5f681a66cba505e1' is the correct path to a directory containing all relevant files for a CLIPTokenizer tokenizer.

I'm simply uploading .safetensors files to a model repo, which worked before. Is there something else I'm missing? Has something in the app changed?

This only happens when I try to use or include a custom model. The app still works for the three standard models.
