Issues loading this model in offline mode

#11
by trieudemo11 - opened

After saving the model with
model.save_pretrained("./")

I loaded it with
model = AutoModelForCausalLM.from_pretrained("./", local_files_only=True, device_map={"": 0}, torch_dtype=torch.float16, low_cpu_mem_usage=True)

I received this error
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like NousResearch/Hermes-2-Pro-Llama-3-8B is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
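For context, the offline-mode documentation linked in the error message describes setting environment variables so that no network call is attempted at all; a minimal example (these variables are documented by Hugging Face, but whether they resolve this particular error depends on the checkpoint layout):

```shell
# Tell huggingface_hub and transformers to use only cached/local files
export HF_HUB_OFFLINE=1
export TRANSFORMERS_OFFLINE=1
```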

The config file is stored in the local folder. I haven't had this issue before with other models.

The fix: you shouldn't store adapter_config.json and config.json in the same folder.
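This would explain the error: when `from_pretrained` finds an `adapter_config.json` in the directory, it appears to treat the folder as a PEFT adapter and tries to resolve the base model named inside it (here `NousResearch/Hermes-2-Pro-Llama-3-8B`) from the Hub, which fails with `local_files_only=True`. A minimal sketch of a pre-flight check (`check_checkpoint_dir` is a hypothetical helper, not part of transformers):

```python
import json
from pathlib import Path

def check_checkpoint_dir(path):
    """Classify a checkpoint folder and flag the ambiguous case where a
    full-model config.json coexists with a PEFT adapter_config.json."""
    d = Path(path)
    has_config = (d / "config.json").exists()
    has_adapter = (d / "adapter_config.json").exists()
    if has_config and has_adapter:
        # adapter_config.json names a base model that the loader may try
        # to fetch from the Hub, breaking local_files_only=True loads
        base = json.loads((d / "adapter_config.json").read_text()).get(
            "base_model_name_or_path"
        )
        return f"ambiguous: adapter references base model {base!r}"
    if has_adapter:
        return "adapter-only checkpoint"
    if has_config:
        return "full-model checkpoint"
    return "no config found"
```

Running this on the save directory before loading makes it obvious whether the loader will see a plain model or an adapter that needs its base model resolved.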

trieudemo11 changed discussion status to closed