OSError: Can't load tokenizer for 'E:\models--Vision-CAIR--vicuna-7b' but my local file exists

#2
by AnYi66 - opened

Because the model files are too large, I downloaded them manually and put them on the E: drive. When I run the MiniGPT-5 project from GitHub, it tells me the tokenizer cannot be found. How can I solve this problem?
[Screenshot attached: Snipaste_2023-11-30_16-31-06.png]
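One common cause of this error: a folder named `models--Vision-CAIR--vicuna-7b` is the Hugging Face cache layout, where the actual files live in a `snapshots/<revision>/` subdirectory, not in the top-level folder. The sketch below (an assumption about your setup, with a hypothetical helper name) locates the subdirectory that actually contains `tokenizer_config.json` so you can pass that path to the loader:

```python
import os
from pathlib import Path
from typing import Optional

def find_tokenizer_dir(root: str) -> Optional[str]:
    """Return the first directory under `root` that directly contains
    tokenizer_config.json. Handles the HF cache layout, where files
    live in models--Org--Name/snapshots/<revision>/ rather than the
    top-level folder."""
    root_path = Path(root)
    # The simple case: the files are directly in the given folder.
    if (root_path / "tokenizer_config.json").exists():
        return str(root_path)
    # Otherwise search subdirectories (e.g. snapshots/<revision>/).
    for p in root_path.rglob("tokenizer_config.json"):
        return str(p.parent)
    return None
```

If it finds a directory, pass that directory (not the top-level cache folder) to `AutoTokenizer.from_pretrained(...)`; if it returns `None`, the tokenizer files (`tokenizer_config.json`, `tokenizer.model`, etc.) were likely not part of your manual download.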