
Upload model.safetensors with huggingface_hub

#2
by jbochi - opened

This is equivalent to https://huggingface.co/grammarly/coedit-large/discussions/4

This new file is equivalent to pytorch_model.bin but safe in the sense that no arbitrary code can be put into it.

These files also happen to load much faster than their PyTorch counterparts:
https://colab.research.google.com/github/huggingface/notebooks/blob/main/safetensors_doc/en/speed.ipynb

The model is too large for the converter Space (https://huggingface.co/spaces/safetensors/convert), so I created the file manually:

```python
from transformers import T5ForConditionalGeneration
from safetensors.torch import save_model

# Load the original PyTorch checkpoint and re-save it as safetensors.
# save_model (rather than save_file) handles T5's shared tensors.
model = T5ForConditionalGeneration.from_pretrained("grammarly/coedit-xxl")
save_model(model, "model.safetensors")
```
Grammarly org

Argh. This doesn't work because the file is missing metadata.

jbochi changed pull request status to closed
