Does anyone else have difficulties using it on Colab?

#112 · opened by Logic-Quantum

OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like meta-llama/Meta-Llama-3-8B-Instruct is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
I keep getting this error even though my token seems to be correct. I am wondering why?
Much appreciated.
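For reference, a minimal sketch of a Colab setup for this gated model is below. It assumes the Meta Llama 3 license has already been accepted on the model page and that the token has read access; the token string is only a placeholder.

```python
# Minimal sketch for loading a gated model in Colab.
# Assumes: the license for meta-llama/Meta-Llama-3-8B-Instruct has been
# accepted on its Hugging Face page, and the token below is replaced
# with a real read-access token (placeholder shown here).
from huggingface_hub import login
from transformers import AutoModelForCausalLM, AutoTokenizer

login(token="hf_...")  # placeholder; paste your own access token

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # requires the accelerate package
)
```

If authentication is in place and the error persists, it may simply be a transient connectivity problem between the Colab runtime and huggingface.co.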

I'm facing the same issue. Did you find a solution?

osanseviero changed discussion status to closed
