RuntimeError: repeats can not be negative
Hi everyone,
I am currently using the meta-llama/Meta-Llama-3-70B-Instruct model for zero-shot prompting on some data, but I am getting this strange error and do not know how to address it. Here is the relevant part of the traceback:
""......
model_outputs = self._forward(model_inputs, **forward_params)
File "/lib64/python3.9/site-packages/transformers/pipelines/text_generation.py", line 327, in _forward
generated_sequence = self.model.generate(input_ids=input_ids, attention_mask=attention_mask, **generate_kwargs)
File "/lib64/python3.9/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
return func(*args, **kwargs)
File "/lib64/python3.9/site-packages/transformers/generation/utils.py", line 1614, in generate
input_ids, model_kwargs = self._expand_inputs_for_generation(
File "/lib64/python3.9/site-packages/transformers/generation/utils.py", line 610, in _expand_inputs_for_generation
input_ids = input_ids.repeat_interleave(expand_size, dim=0)
RuntimeError: repeats can not be negative
""
I would appreciate any help or insights.
Thank you in advance.