Please fix fp16 #15

by KenjieDec - opened

Please remove
"<<<<<<< HEAD ======= >>>>>>> b45bafccd9d0e0757b70a54c7ebc32ff56ca9ee1" from files such as model_index.json

And sorry about the previous PR; I don't know how to edit a branch or delete a PR.
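Leftover conflict markers like the ones quoted above are easy to locate with a recursive grep. A minimal sketch, using a hypothetical `demo_repo` directory to stand in for the actual repository checkout:

```shell
# Build a hypothetical demo directory containing one file with leftover
# merge-conflict markers and one clean file.
mkdir -p demo_repo
printf '<<<<<<< HEAD\n{"a": 1}\n=======\n{"a": 2}\n>>>>>>> b45bafc\n' > demo_repo/model_index.json
printf '{"clean": true}\n' > demo_repo/ok.json

# List every *.json that still contains a conflict marker; in the real
# repo you would point this at the checkout root instead of demo_repo.
grep -rl '<<<<<<<' --include='*.json' demo_repo
```

Only `demo_repo/model_index.json` is listed, since `ok.json` contains no marker.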

Oops, forgot about that. I'll remove it in a sec, thanks.

Hey! Thanks for editing it out of model_index.json!
But the other JSON files seem to contain those markers too. Thank you!

@hakurei I've noticed this was fixed in the unet config at dc7f724, but the same issue is still present in vae/config.json.

i think it should be fixed now. git doesn't like me

@hakurei I can confirm the JSON files issue appears to be fixed. However, attempting to load the fp16 branch in Diffusers now results in the following error message. This also happens in the demo Colab.

OSError: You seem to have cloned a repository without having git-lfs installed. Please install git-lfs and run 'git lfs install' followed by 'git lfs pull' in the folder you cloned.

I did some brief investigation and it seems like pointer files have been added in place of the intended binary files on the fp16 branch. For instance, unet/diffusion_pytorch_model.bin is a 3.44 GB file in the main branch, but it is listed as having only 297 bytes in the fp16 one when it should be closer to 1.7 GB. In the main branch, attempting to manually download the .bin works as expected, but in the fp16 one, it results in a plaintext Git LFS pointer.
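The symptom described above is easy to check for by hand: a Git LFS pointer is a tiny text file whose first line starts with `version https://git-lfs`, followed by the object's oid and size. A minimal sketch, using a hypothetical `fake_model.bin` to stand in for `unet/diffusion_pytorch_model.bin`:

```shell
# Create a hypothetical stand-in for a checked-out LFS pointer file
# (the real pointer for the 1.7 GB weights would look much like this).
printf 'version https://git-lfs.github.com/spec/v1\noid sha256:deadbeef\nsize 1700000000\n' > fake_model.bin

# If the first line matches the LFS pointer header, the real binary was
# never pulled and 'git lfs install' + 'git lfs pull' is needed.
if head -n 1 fake_model.bin | grep -q '^version https://git-lfs'; then
  echo "pointer file detected"
fi
```

On the broken fp16 branch this check would fire for the `.bin` files, while on main it would not, matching the download behavior described above.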

In any case, thanks a lot for your work on Waifu Diffusion!!

Crap. I'll just redo this branch.

Honestly, the v1.3 update generally broke the repository for me -- and since I've been using an auto-download-and-setup sheet on Colab that I can't really edit, it's cost me the ability to use Waifu Diffusion in general. I was in the middle of a project, too (guided recursive image-to-image), and v1.3 is an overfit model for my purposes -- all of which I could easily fix if there was a full backup of the v1.2 fp16 model somewhere on Huggingface.

Or the ability to branch from old versions. Don't see that anywhere in the interface.

hopefully it works now, let me know if something goes wrong