Error when using falcon-7b model for embeddings

#25
by Shilpil - opened

Here is sample code that I am using for embeddings:

from transformers import (
    AutoTokenizer,
    AutoModelForCausalLM,
    AutoConfig,
)

config = AutoConfig.from_pretrained("tiiuae/falcon-7b", output_hidden_states=False, trust_remote_code=True)
# config.init_device = 'cuda:0'
tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-7b")
model = AutoModelForCausalLM.from_pretrained("tiiuae/falcon-7b", device_map="auto", config=config, trust_remote_code=True)
print(model)
print(model)

inputs = tokenizer(["Today is"], return_tensors="pt")

output = model(**inputs)

hidden_states = output[2]

embedding_output = hidden_states[0]

I am getting the following error:

Traceback (most recent call last):
  File "/home/ubuntu/compliance/exp.py", line 15, in
    output = model(**inputs)
  File "/home/ubuntu/compliance/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/ubuntu/compliance/env/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/home/ubuntu/.cache/huggingface/modules/transformers_modules/tiiuae/falcon-7b-instruct/8281cc02a66b7e15ad7afc18b16e9f683d8316c2/modelling_RW.py", line 749, in forward
    raise ValueError(f"Got unexpected arguments: {deprecated_arguments}")
ValueError: Got unexpected arguments: {'token_type_ids': tensor([[0, 0]], device='cuda:0')}

Please help me solve this error.

Technology Innovation Institute org

See this discussion for a solution :)
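For readers landing here: the usual workaround for this error (a sketch, not necessarily the fix in the linked discussion) is to remove `token_type_ids` from the tokenizer output before the forward pass, since Falcon's custom `forward()` rejects it. Passing `return_token_type_ids=False` to the tokenizer call achieves the same result. The pattern is shown below on a plain dict standing in for the `BatchEncoding` returned by `tokenizer(...)`:

```python
# Stand-in for inputs = tokenizer(["Today is"], return_tensors="pt"),
# whose output may include token_type_ids that Falcon's forward() rejects.
# The token ids here are illustrative placeholders, not real Falcon ids.
inputs = {
    "input_ids": [[1234, 304]],
    "attention_mask": [[1, 1]],
    "token_type_ids": [[0, 0]],   # the offending key from the traceback
}

# Workaround: drop the key before calling model(**inputs).
inputs.pop("token_type_ids", None)

print(sorted(inputs))  # → ['attention_mask', 'input_ids']
```

Note also that the snippet in the question sets `output_hidden_states=False` in the config but then indexes `output[2]`; to actually get hidden states, pass `output_hidden_states=True` (in the config or the forward call) and read `output.hidden_states`.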

FalconLLM changed discussion status to closed
