The current model class (JAISModel) is not compatible with `.generate()`, as it doesn't have a language model head.
Hello Team,
I encountered an error when executing the sample code and would appreciate your assistance in resolving it.
The error trace is as follows:
- The error originates in file `"<stdin>"`, line 1, in `<module>`.
- It traces back to a function called `get_response_test` in the same file, line 5.
- The issue then passes through `/LLM/miniconda3/envs/jais-13b/lib/python3.10/site-packages/torch/utils/_contextlib.py`, line 115, in the `decorate_context` function.
- Further, the `generate` function of `/LLM/miniconda3/envs/jais-13b/lib/python3.10/site-packages/transformers/generation/utils.py`, line 1210, runs a validation check on the model class.
- The actual error is raised at line 1089 of the same file, in the `_validate_model_class` function. It throws a `TypeError`, indicating that the current model class (`JAISModel`) is incompatible with `.generate()`, as it lacks a language model head.
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 5, in get_response_test
File "/LLM/miniconda3/envs/jais-13b/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/LLM/miniconda3/envs/jais-13b/lib/python3.10/site-packages/transformers/generation/utils.py", line 1210, in generate
self._validate_model_class()
File "/LLM/miniconda3/envs/jais-13b/lib/python3.10/site-packages/transformers/generation/utils.py", line 1089, in _validate_model_class
raise TypeError(exception_message)
TypeError: The current model class (JAISModel) is not compatible with
.generate(), as it doesn't have a language model head.
>>>
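For context, the helper in the trace is essentially the following minimal sketch (the checkpoint path is shortened, and the `AutoModel` loading line is the part I suspect is wrong, since it is what yields the bare `JAISModel`):

```python
import torch
from transformers import AutoModel, AutoTokenizer

model_path = "./jais-13b"  # placeholder for the local checkpoint path

tokenizer = AutoTokenizer.from_pretrained(model_path)
# Plain AutoModel resolves to the bare JAISModel, which has no language model head.
model = AutoModel.from_pretrained(model_path, trust_remote_code=True)

@torch.no_grad()  # this wrapper is the decorate_context frame in the traceback
def get_response_test(prompt):
    inputs = tokenizer(prompt, return_tensors="pt")
    # Fails here: _validate_model_class() rejects JAISModel
    return model.generate(**inputs, max_new_tokens=50)

get_response_test("Hello")
```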
Looking forward to your guidance to fix this issue.
Thanks,
I am also facing the same issue. Is there any fix for this?
@FK7 I was able to get it working by loading the model with AutoModelForCausalLM. The JAIS team map their JAISLMHeadModel (the variant that has the language model head) to AutoModelForCausalLM, so `.generate()` works with that class. Hope it helps.
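A minimal sketch of what worked for me (the model path is a placeholder and the generation settings are just an example):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./jais-13b"  # placeholder for the local checkpoint path

tokenizer = AutoTokenizer.from_pretrained(model_path)
# AutoModelForCausalLM resolves to JAISLMHeadModel, which carries the
# language model head that .generate() requires.
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    device_map="auto",
    trust_remote_code=True,
)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```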
I have the same issue when trying to quantize the model.