Text Generation · PyTorch · causal-lm · rwkv

How to load rwkv-4-world with from_pretrained()

#4
by Bk9x - opened

How do I load rwkv-4-world with from_pretrained()?

@Bk9x

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("BlinkDL/rwkv-4-world")
model = AutoModelForCausalLM.from_pretrained("BlinkDL/rwkv-4-world")

HTTPError Traceback (most recent call last)

/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_errors.py in hf_raise_for_status(response, endpoint_name)
    260     try:
--> 261         response.raise_for_status()
    262     except HTTPError as e:

[... 12 frames hidden ...]

HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/BlinkDL/rwkv-4-world/resolve/main/config.json

The above exception was the direct cause of the following exception:

EntryNotFoundError Traceback (most recent call last)

EntryNotFoundError: 404 Client Error. (Request ID: Root=1-64abe2d6-0079ece4247a30126051b230;5dbd5d3d-d120-405c-b114-57c3f86f39a4)

Entry Not Found for url: https://huggingface.co/BlinkDL/rwkv-4-world/resolve/main/config.json.

During handling of the above exception, another exception occurred:

OSError Traceback (most recent call last)

/usr/local/lib/python3.10/dist-packages/transformers/utils/hub.py in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, use_auth_token, revision, local_files_only, subfolder, repo_type, user_agent, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash)
    461     if revision is None:
    462         revision = "main"
--> 463     raise EnvironmentError(
    464         f"{path_or_repo_id} does not appear to have a file named {full_filename}. Checkout "
    465         f"'https://huggingface.co/{path_or_repo_id}/{revision}' for available files."

OSError: BlinkDL/rwkv-4-world does not appear to have a file named config.json. Checkout 'https://huggingface.co/BlinkDL/rwkv-4-world/main' for available files.


It seems that the BlinkDL/rwkv-4-world repo is not compatible with Hugging Face Transformers: it does not include a config.json, which is exactly the file from_pretrained() is failing to download in the 404 above.
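One way to confirm this is to list what the repository actually contains. A quick check, using huggingface_hub (which transformers already depends on):

from huggingface_hub import list_repo_files

# List the files hosted in the repo; if no config.json is present,
# AutoModelForCausalLM.from_pretrained() has nothing to load from.
print(list_repo_files("BlinkDL/rwkv-4-world"))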

Go to https://huggingface.co/RWKV/rwkv-4-world-7b instead; that is how I got around it for the 7B version.
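For reference, a minimal sketch of loading that converted repo with Transformers. The repo id comes from the link above; passing trust_remote_code=True is an assumption here (the World models use a custom tokenizer), so follow whatever the model card for that repo says:

from transformers import AutoTokenizer, AutoModelForCausalLM

# Converted, Transformers-compatible repo (linked above); trust_remote_code=True
# is assumed to be needed for the custom World tokenizer.
tokenizer = AutoTokenizer.from_pretrained("RWKV/rwkv-4-world-7b", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("RWKV/rwkv-4-world-7b", trust_remote_code=True)

prompt = "The Eiffel Tower is located in"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))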
