RuntimeError: The size of tensor a (24) must match the size of tensor b (128) at non-singleton dimension 3

#37 opened by Ink

Code:
import torch

# tokenizer and model are assumed to have been loaded beforehand
# (e.g. via AutoTokenizer.from_pretrained / AutoModelForCausalLM.from_pretrained)

text = 'Hello, is it me you are looking for? '
tokens = tokenizer(text=text)
print(tokens)
input_toks = tokens['input_ids']

# input_toks is a plain Python list, so torch.tensor(input_toks) is a 1-D tensor
print(model(torch.tensor(input_toks)))

Output:
{'input_ids': [128000, 9906, 11, 374, 433, 757, 499, 527, 3411, 369, 30, 220], 'attention_mask': [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]}

File "Lib\site-packages\transformers\models\llama\modeling_llama.py", line 173, in apply_rotary_pos_emb
q_embed = (q * cos) + (rotate_half(q) * sin)
RuntimeError: The size of tensor a (24) must match the size of tensor b (128) at non-singleton dimension 3

Can someone please help me sort out this error?
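
In case a worked example helps while this waits for an answer: this traceback usually means input_ids reached the model without a batch dimension. torch.tensor(input_toks) here is a 1-D tensor of shape (12,), while the Llama forward pass expects shape (batch_size, seq_len). A minimal sketch of the usual fix, assuming the same tokenizer and model objects as in the snippet above:

import torch

# Ask the tokenizer for PyTorch tensors; it returns input_ids
# with an explicit batch dimension, shape (1, seq_len).
tokens = tokenizer(text, return_tensors='pt')
print(tokens['input_ids'].shape)  # torch.Size([1, 12]) for this prompt

# Equivalent manual fix: wrap the token list in another list
# input_ids = torch.tensor([input_toks])  # shape (1, 12)

with torch.no_grad():
    out = model(**tokens)
print(out.logits.shape)  # (1, seq_len, vocab_size)

Passing return_tensors='pt' (or wrapping the list as [input_toks]) keeps the (batch, sequence) layout that the model's attention and rotary-embedding code rely on.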
