File not found error

#25, opened by pradeepmohans

When I run this:

from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-xxl")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-xxl")

input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids

outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))

I get the following error:


EntryNotFoundError Traceback (most recent call last)
/usr/local/lib/python3.8/dist-packages/transformers/modeling_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
1357 # Load from URL or cache if already cached
-> 1358 resolved_archive_file = cached_path(
1359 archive_file,

4 frames
EntryNotFoundError: 404 Client Error: Entry Not Found for url: https://huggingface.co/google/flan-t5-xxl/resolve/main/pytorch_model.bin

During handling of the above exception, another exception occurred:

OSError Traceback (most recent call last)
/usr/local/lib/python3.8/dist-packages/transformers/modeling_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
1401 )
1402 else:
-> 1403 raise EnvironmentError(
1404 f"{pretrained_model_name_or_path} does not appear to have a file named {WEIGHTS_NAME}, "
1405 f"{TF2_WEIGHTS_NAME}, {TF_WEIGHTS_NAME} or {FLAX_WEIGHTS_NAME}."

OSError: google/flan-t5-xxl does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.

Hi @pradeepmohans,
Thanks for the issue. Please use transformers>=4.18.0 in order to load large models whose weights are stored as sharded checkpoints.
Thanks!
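
For reference, a minimal sketch of checking the installed version before retrying the snippet above (the pip command is an assumption about how your environment installs packages; the shard file naming in the comment describes the general sharded-checkpoint layout, not the exact files in this repo):

# Upgrade first, e.g. in a notebook cell or shell:
#   pip install -U "transformers>=4.18.0"

import transformers

# Repos like google/flan-t5-xxl store weights as multiple shard files plus an
# index (pytorch_model.bin.index.json) instead of a single pytorch_model.bin.
# from_pretrained only resolves these sharded checkpoints from v4.18.0 onwards,
# so verify the version before loading the model again.
print(transformers.__version__)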
