ModuleNotFoundError: No module named 'transformers_modules.NousResearch.OLMo-Bitnet

#1
by fahdmirzac - opened

Hi,
While running it on Ubuntu, I get the following error. I have built transformers from source:

pip install git+https://github.com/huggingface/transformers

and my transformers version is 4.40.0.dev0

and when I run the following:
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("NousResearch/OLMo-Bitnet-1B",
torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto")

I get the following error:

File "/home/ubuntu/.cache/huggingface/modules/transformers_modules/NousResearch/OLMo-Bitnet-1B/9c9783f0983e51c6dfe84e22c054611ba4eae27f/optim.py", line 13, in <module>
from .model import LayerNormBase, BitLinear158
ModuleNotFoundError: No module named 'transformers_modules.NousResearch.OLMo-Bitnet-1B.9c9783f0983e51c6dfe84e22c054611ba4eae27f.model'

Hi @fahdmirzac , have you resolved the problem yet?
I got the same error when trying transformers (version 4.38.2).
Here is the error:
File "OLMo-Bitnet-1B/optim.py", line 13, in <module>
from .model import LayerNormBase, BitLinear158
ModuleNotFoundError: No module named 'transformers_modules.OLMo-Bitnet-1B.model'

I found a way to bypass this: you only need to copy the missing files from the model directory into the cache path (somehow the model.py file is not included):

cd model
cp *.py ~/.cache/huggingface/modules/transformers_modules/OLMo-Bitnet-1B/
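If you prefer to script the workaround, a rough Python equivalent of the cp command above is sketched below. The directories are illustrative (in practice, the source is your downloaded model repo and the destination is the transformers_modules cache path from the traceback); the demo uses throwaway temp folders:

```python
import glob
import os
import shutil
import tempfile

def copy_module_files(model_dir: str, cache_dir: str) -> list:
    """Copy every .py file from the downloaded model repo into the
    transformers_modules cache, mirroring `cp *.py <cache-path>`."""
    os.makedirs(cache_dir, exist_ok=True)
    copied = []
    for path in glob.glob(os.path.join(model_dir, "*.py")):
        shutil.copy(path, cache_dir)
        copied.append(os.path.basename(path))
    return sorted(copied)

# Demo with throwaway directories standing in for the real paths.
src = tempfile.mkdtemp()
dst = os.path.join(tempfile.mkdtemp(), "OLMo-Bitnet-1B")
for name in ("model.py", "optim.py"):
    open(os.path.join(src, name), "w").close()
print(copy_module_files(src, dst))  # → ['model.py', 'optim.py']
```

This copies all module files at once, so a missing model.py can no longer break the relative import in optim.py.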

@namtran where exactly is this model directory? I don't see it on my Ubuntu system, and I am still getting the same error.

The model folder is the Hugging Face model folder created when you download the model. In your case, I think you can try:
cd OLMo-Bitnet-1B
cp *.py /home/ubuntu/.cache/huggingface/modules/transformers_modules/NousResearch/OLMo-Bitnet-1B/9c9783f0983e51c6dfe84e22c054611ba4eae27f/
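To find the destination for that cp command on your own machine, you can rebuild the cache path from the repo id and commit hash. This is a sketch based on the layout visible in the traceback above (~/.cache/huggingface/modules/transformers_modules/<org>/<name>/<commit>/); the layout is assumed, not taken from official documentation:

```python
import os

def transformers_modules_cache(repo_id: str, revision: str) -> str:
    """Build the directory where transformers caches a repo's remote
    code, following the layout seen in the traceback in this thread."""
    base = os.path.expanduser(
        "~/.cache/huggingface/modules/transformers_modules")
    return os.path.join(base, *repo_id.split("/"), revision)

print(transformers_modules_cache(
    "NousResearch/OLMo-Bitnet-1B",
    "9c9783f0983e51c6dfe84e22c054611ba4eae27f"))
```

The commit hash is the long hex folder name from your own error message, which may differ from the one shown here.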

@namtran thanks for the reply. Could you please share all the commands you are using so that I can reproduce this?
