Model Evals Failed
Hi there, the two models I tried to eval yesterday failed; they're both Mixtral finetunes. I was able to load and quick-eval them fine on RunPod.
I think they want safetensors here, your models are bins.
First steps before submitting a model
- Make sure you can load your model and tokenizer using AutoClasses:
from transformers import AutoConfig, AutoModel, AutoTokenizer

revision = "main"  # or the specific branch/commit you submitted
config = AutoConfig.from_pretrained("your model name", revision=revision)
model = AutoModel.from_pretrained("your model name", revision=revision)
tokenizer = AutoTokenizer.from_pretrained("your model name", revision=revision)
If this step fails, follow the error messages to debug your model before submitting it. It's likely your model has been improperly uploaded. Note: make sure your model is public! Note: if your model needs trust_remote_code=True, we do not support this option yet, but we are working on adding it, stay posted!
- Convert your model weights to safetensors
Safetensors is a newer format for storing weights that is safer and faster to load and use. It will also allow us to add the number of parameters of your model to the Extended Viewer!
Huh, I missed that one crucial step. I was so used to Axolotl outputting .safetensors, I guess. Thanks a lot for pointing that out; I'll convert them.
Edit: wait, looks like some of my successful evals were .bin though
Maybe they've changed it and no longer accept bins.
And you've also picked float16 for evaluation instead of bfloat16 (at least your config says it's bfloat16; I personally don't know what you cooked up).
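The dtype being referred to is the `torch_dtype` field stored in the model's `config.json`, which you can compare against the precision you picked at submission time. A quick sketch of that field (constructing a bare `PretrainedConfig` just for illustration; a real check would load your repo's config with `AutoConfig.from_pretrained`):

```python
from transformers import PretrainedConfig

# A config whose declared weight dtype is bfloat16, as in the model
# discussed above; this is what should match the submitted precision.
config = PretrainedConfig(torch_dtype="bfloat16")
print(config.torch_dtype)
```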
Hi, thanks for your issue!
It's not that we don't accept .bin per se; it's just that we don't guarantee that they'll run, as it's easier for them to contain silent bugs than the safetensors format.
Both your models failed at the download step, feel free to ping me once your weights are converted and I'll re-run your jobs!
@clefourrier I see. I've now converted both models to safetensors.
Perfect, thank you :)
Just added both your models back to pending!
Closing the issue, feel free to reopen if there's another issue at eval :)