I cannot import the model through Hugging Face
I am using transformers version 4.45.0.dev0. When I run `model = AutoModelForCausalLM.from_pretrained('Jiabin99/HiGPT')`, it raises the following error:

```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "E:\Software\Anaconda\install\envs\py312\Lib\site-packages\transformers\models\auto\auto_factory.py", line 520, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Software\Anaconda\install\envs\py312\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 1016, in from_pretrained
    raise ValueError(
ValueError: The checkpoint you are trying to load has model type HeteroLlama but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
```
I also tried loading the model from a local path:

```
>>> model = AutoModelForCausalLM.from_pretrained("E:\Study\GNN\Code\HeteroBackdoorAttack\LLM\HiGPT")
<stdin>:1: SyntaxWarning: invalid escape sequence '\S'
Traceback (most recent call last):
  File "E:\Software\Anaconda\install\envs\py312\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 1014, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
                   ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Software\Anaconda\install\envs\py312\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 716, in __getitem__
    raise KeyError(key)
KeyError: 'HeteroLlama'
```
How can I address this?
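For context, the failure comes from transformers' model-type registry: `AutoConfig.from_pretrained` reads the `model_type` field from the checkpoint's `config.json` and looks it up in an internal `CONFIG_MAPPING`; a type that was never registered raises exactly this `KeyError`. A minimal sketch of that mechanism (names simplified; the `"HeteroLlamaConfig"` entry is a hypothetical stand-in, not the real HiGPT class):

```python
# Simplified sketch of the registry lookup that fails with KeyError: 'HeteroLlama'.
# In transformers, AutoConfig reads "model_type" from config.json and looks it
# up in CONFIG_MAPPING; an unregistered type produces the error shown above.
CONFIG_MAPPING = {"llama": "LlamaConfig", "gpt2": "GPT2Config"}  # stand-in registry

def config_class_for(model_type):
    """Return the registered config class name, mirroring the AutoConfig lookup."""
    if model_type not in CONFIG_MAPPING:
        raise ValueError(
            f"The checkpoint you are trying to load has model type {model_type!r} "
            "but this registry does not recognize the architecture."
        )
    return CONFIG_MAPPING[model_type]

# Registering the custom type (what transformers' AutoConfig.register does,
# given a real config class) makes the lookup succeed:
CONFIG_MAPPING["HeteroLlama"] = "HeteroLlamaConfig"  # hypothetical entry
print(config_class_for("HeteroLlama"))  # -> HeteroLlamaConfig
```

In practice, if the `Jiabin99/HiGPT` repository ships its own modeling code, passing `trust_remote_code=True` to `from_pretrained` may let transformers load the custom architecture; otherwise the `HeteroLlama` classes would have to come from the HiGPT codebase itself. `trust_remote_code` is a real `from_pretrained` parameter, but whether this particular repo includes the required custom code is an assumption.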