Unable to load model

#1
by varun500 - opened

from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("RWKV/rwkv-4-169m-pile")
model = AutoModelForCausalLM.from_pretrained("RWKV/rwkv-4-169m-pile")

File /opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py:643, in _LazyConfigMapping.__getitem__(self, key)
    641         return self._extra_content[key]
    642     if key not in self._mapping:
--> 643         raise KeyError(key)
    644     value = self._mapping[key]
    645     module_name = model_type_to_module_name(key)

KeyError: 'rwkv'
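For context, here is a minimal sketch (not the actual Transformers source, just a simplified illustration) of why this KeyError occurs: the auto classes look up the model's `model_type` string in a mapping, and a release that predates RWKV simply has no "rwkv" key.

```python
class LazyConfigMapping:
    """Hypothetical, simplified stand-in for transformers' _LazyConfigMapping."""

    def __init__(self, mapping):
        self._mapping = mapping        # model_type -> config class name
        self._extra_content = {}       # dynamically registered models

    def __getitem__(self, key):
        if key in self._extra_content:
            return self._extra_content[key]
        if key not in self._mapping:
            # An unknown model_type raises KeyError, matching the traceback above
            raise KeyError(key)
        return self._mapping[key]

# A mapping from an older release that does not yet know about RWKV:
old_mapping = LazyConfigMapping({"gpt2": "GPT2Config", "bert": "BertConfig"})
try:
    old_mapping["rwkv"]
except KeyError as e:
    print(f"KeyError: {e}")  # → KeyError: 'rwkv'
```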

I guess your installed version of Transformers does not support this model yet: the KeyError: 'rwkv' means the library does not recognize the rwkv model type, so upgrading transformers to a recent release should fix it. You can also try sgugger/rwkv-430M-pile, per https://huggingface.co/docs/transformers/model_doc/rwkv.
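As a sanity check before loading, one could compare the installed version against the release that introduced RWKV. A minimal sketch (the `supports_rwkv` helper is hypothetical, and the "4.29.0" cutoff is an assumption based on the RWKV docs appearing in that release cycle):

```python
def supports_rwkv(installed: str, minimum: str = "4.29.0") -> bool:
    """Return True if the installed transformers version should know 'rwkv'.

    Compares dotted version strings numerically, not lexically, so that
    e.g. "4.30.2" correctly ranks above "4.9.0". Pre-release suffixes
    (like "4.30.0.dev0") are not handled in this sketch.
    """
    to_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return to_tuple(installed) >= to_tuple(minimum)

print(supports_rwkv("4.28.1"))  # False: this release raises KeyError: 'rwkv'
print(supports_rwkv("4.30.2"))  # True: AutoModelForCausalLM can resolve "rwkv"
```

In practice, `pip install --upgrade transformers` and retrying the `from_pretrained` call is the quickest test.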
