Offline inference for pre-trained models without safetensors fails with transformers version 4.40.0

#27
by sukumarburra - opened

When I try to run offline inference with the model Helsinki-NLP/opus-mt-en-fr, which is already saved in my Hugging Face cache, using the latest transformers version 4.40.0, it tries to reach huggingface.co over the network even though the model is available in the local cache. Here is the error:

    model = AutoModelForSeq2SeqLM.from_pretrained(model_path)
  File "/Users/sukumar.burra/.env/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 563, in from_pretrained
    return model_class.from_pretrained(
  File "/Users/sukumar.burra/.env/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3370, in from_pretrained
    if not has_file(pretrained_model_name_or_path, safe_weights_name, **has_file_kwargs):
  File "/Users/sukumar.burra/.env/lib/python3.10/site-packages/transformers/utils/hub.py", line 627, in has_file
    r = requests.head(url, headers=headers, allow_redirects=False, proxies=proxies, timeout=10)
  File "/Users/sukumar.burra/.env/lib/python3.10/site-packages/requests/api.py", line 100, in head
    return request("head", url, **kwargs)
  File "/Users/sukumar.burra/.env/lib/python3.10/site-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
  File "/Users/sukumar.burra/.env/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/Users/sukumar.burra/.env/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/Users/sukumar.burra/.env/lib/python3.10/site-packages/requests/adapters.py", line 519, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /Helsinki-NLP/opus-mt-en-fr/resolve/main/model.safetensors (Caused by NameResolutionError("<urllib3.connection.HTTPSConnection object at 0x16f486740>: Failed to resolve 'huggingface.co' ([Errno 8] nodename nor servname provided, or not known)"))

I noticed a similar issue with another model, cardiffnlp/twitter-roberta-base-sentiment-latest, which also has no safetensors version.
I do not see this issue with previous versions of transformers; version 4.39.3 still works.
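Until this is resolved, downgrading to the previous release is what I am doing to keep offline loading working:

```shell
pip install "transformers==4.39.3"
```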

This issue only occurs with models that do not have a safetensors version of the weights in their Hugging Face repo.
Is this expected, or is it a bug in the latest transformers package?
