Error | No such file or directory

#1
by MaZeNsMz - opened

I receive this error: No such file or directory
image.png

Baichuan Intelligent Technology org

You should make sure the *.bin file exists and that its md5 is correct.

It is not downloaded; this happened while I was trying to download it.

In H2oGPT, I usually add any HF LLM and it downloads automatically.

I already downloaded the non-4-bit variant of this model successfully, but this one didn't work.

Regards...

Hi I've got the same error (No such file or directory) while doing:

```
from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan2-13B-Chat-4bits", trust_remote_code=True)
```

But git clone works well.

I didn't try git clone, and I'm not sure how to move the download into the HF cache folder.

This is a bug in the modeling Python code. You can create a symbolic link from the cache to your local directory so you don't need to download again, if you have downloaded it before:

```
from huggingface_hub import snapshot_download

snapshot_download(repo_id="baichuan-inc/Baichuan2-13B-Chat-4bits", local_dir="/your/localdir/Baichuan2-13B-Chat-4bits")
```
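The symlink idea can be sketched like this. This is a minimal, self-contained sketch: `cache_path` is only a stand-in for the snapshot directory `snapshot_download()` actually returned on your machine, and the target directory name mirrors the relative path the modeling code tries to open.

```python
import os
import tempfile

# Stand-in for the snapshot directory snapshot_download() returned;
# replace with your real HF cache path under ~/.cache/huggingface/hub/.
cache_path = tempfile.mkdtemp()

# Relative path the buggy modeling code tries to open directly.
local_dir = os.path.join(tempfile.mkdtemp(), "Baichuan2-13B-Chat-4bits")

# Link instead of copying, so the weights are neither re-downloaded
# nor duplicated on disk.
os.symlink(cache_path, local_dir, target_is_directory=True)
print(os.path.islink(local_dir))
```

On Windows, creating symlinks may require elevated privileges; a directory junction or a plain copy works as a fallback.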

But even after that, you still won't be able to run it, because there is another bug in the modeling_baichuan.py file...


So many bugs; it's still not working.

Baichuan Intelligent Technology org

You can try:

```
model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan2-13B-Chat-4bits", trust_remote_code=True, device_map='auto')
```

With device_map='auto', the 4-bit model will run on GPUs. It cannot run on CPU because bitsandbytes does not support 4-bit inference on CPU.

The same error, running on Google Colab:
No such file or directory: 'baichuan-inc/Baichuan2-7B-Chat-4bits/pytorch_model.bin'

Baichuan Intelligent Technology org

Maybe it is caused by other errors?

Baichuan Intelligent Technology org

Can you post your code and the md5 of baichuan-inc/Baichuan2-7B-Chat-4bits/pytorch_model.bin?

ModuleNotFoundError: No module named 'transformers_modules.baichuan-inc/Baichuan2-13B-Chat-4bits'
so many bugs

Baichuan Intelligent Technology org

Can you post your code?

The model cannot be loaded; the error message says "pytorch_model.bin" cannot be found.

After using "snapshot_download" first and then loading the model, everything works with no errors. Thanks!

But the 7B-chat-4bits model may not be good at logical inference. For example:
'''
台风「小犬」已减弱为强烈热带风暴,天文台在今日(9日)上午11时40分改发3号强风信号,取代昨日(8日)晚上11时50分发出的8号东北烈风或暴风信号,预料本港平均风速每小时41至62公里。天文台表示,3号风球将维持一段时间,当小犬对本港的威胁进一步减低时,天文台会改发1号戒备信号,或取消所有热带气旋警告信号。另外,红色暴雨警告信号现正生效。
在正午12时,小犬集结在香港之西南偏西约160公里,预料向西南偏西移动,时速约12公里,横过广东西部沿岸。在过去一小时,横澜岛录得的最高持续风速为每小时69公里,最高阵风超过每小时82公里。
另外,天文台在今日凌晨1时55分由黄色暴雨警告改发红色暴雨警告,至清晨4时再改发黑色暴雨警告,并在早上10时30分再度发出红雨警告。天文台指,小犬继续远离香港,本港风力亦逐步减弱,但其相关的雨带仍会为本港带来狂风骤雨及雷暴,雨势有时颇大。
自午夜起,本港大部分地区录得超过150毫米雨量,而黄大仙、观塘、港岛及大屿山部分地区的雨量更超过300毫米。另外,今日凌晨2时天文台发出山泥倾泻警告,凌晨3时15分又发出新界北部水浸特别报告,指新界北部,尤其是八乡及锦田一带,正受大雨影响;在过去3小时,该区录得超过90毫米雨量。雷暴警告现正生效。

根据上文, 甚么地方没有下雨?'''

The model's reply is: 根据上文,黄大仙、观塘、港岛和大屿山的部分地区没有下雨。 (According to the text above, parts of Wong Tai Sin, Kwun Tong, Hong Kong Island and Lantau had no rain.)

This is totally wrong. When I tested other 7B models, such as ChatGLM2, they replied with reasonable answers!

On the other hand, the 13B-chat-4bits model gives excellent replies. So if you have 16 GB of VRAM, 13B-chat-4bits is a good choice.

Baichuan Intelligent Technology org

Can you confirm the md5 of your pytorch_model.bin?


MD5 of pytorch_model.bin of "7B chat 4bits " is 04a1d7e10c33680dcf30d23befa6314f
MD5 of pytorch_model.bin of "13B chat 4bits" is b3ac678cf5bd552f4f12843d542319cd
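For anyone else who wants to compare checksums, here is a minimal sketch for hashing a large checkpoint without loading it all into memory; the path in the commented usage line is just an example to adjust to your local checkout.

```python
import hashlib

def file_md5(path, chunk_size=1 << 20):
    """MD5 of a file, streamed in 1 MiB chunks so multi-GB checkpoints fit in RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Example usage (adjust the path to your local checkout):
# print(file_md5("Baichuan2-7B-Chat-4bits/pytorch_model.bin"))
```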

OK, the md5 is fine. Can you confirm whether you used the chat interface when testing? If not, you may need to add "<reserved_106>" and "<reserved_107>", for example: "<reserved_106>周杰伦第一张专辑叫什么<reserved_107>"
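If you are calling generate() directly rather than the model's chat interface, wrapping the prompt can be sketched as below. The marker strings come from the advice above; whether your tokenizer maps them to the intended special tokens is an assumption to verify against your local copy of the model.

```python
# Baichuan2 chat markers mentioned above; treat these as assumptions
# to verify against your tokenizer's special-token configuration.
USER_TOKEN = "<reserved_106>"
ASSISTANT_TOKEN = "<reserved_107>"

def build_chat_prompt(user_message: str) -> str:
    """Wrap a raw user message in the chat markers the model was trained on."""
    return f"{USER_TOKEN}{user_message}{ASSISTANT_TOKEN}"

print(build_chat_prompt("周杰伦第一张专辑叫什么"))
```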


Thank you! I think the poor answer is due to the size of the model; 7B is too small! When I tested your 13B int4 chat model, WOW, the answers to my questions were very good. The model behaves totally differently. It's just surprisingly good. Really a great model!
Untitled.png


I solved this problem by adding:
```
from huggingface_hub import snapshot_download

snapshot_download(repo_id="baichuan-inc/Baichuan2-13B-Chat-4bits", local_dir="baichuan-inc/Baichuan2-13B-Chat-4bits")
``` 

I get an error when loading with oobabooga. How can I solve this?

屏幕截图 2023-10-20 233354.png
