Commit e945d50: Add model files and tokenizer configuration; include special tokens and initialization settings
2024-11-16 23:03:13,077 - main - INFO - Loading tokenizer...
2024-11-16 23:03:13,077 - main - ERROR - Error initializing model: Incorrect path_or_model_id: '\poetica'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
2024-11-16 23:03:13,078 - main - ERROR - Failed to initialize model manager
2024-11-16 23:03:48,134 - main - INFO - Loading tokenizer...
2024-11-16 23:03:48,135 - main - ERROR - Error initializing model: \ does not appear to have a file named config.json. Checkout 'https://huggingface.co/\/tree/None' for available files.
2024-11-16 23:03:48,135 - main - ERROR - Failed to initialize model manager
2024-11-16 23:05:52,528 - main - INFO - Loading tokenizer...
2024-11-16 23:05:52,530 - main - ERROR - Error initializing model: Incorrect path_or_model_id: './models/tokenizer_config.json'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
2024-11-16 23:05:52,530 - main - ERROR - Failed to initialize model manager
2024-11-16 23:06:20,012 - main - INFO - Loading tokenizer...
2024-11-16 23:06:20,012 - main - ERROR - Error initializing model: Incorrect path_or_model_id: './models/tokenizer_config.json'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
2024-11-16 23:06:20,012 - main - ERROR - Failed to initialize model manager
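The failures above all stem from the value handed to `AutoTokenizer.from_pretrained`: a stray `'\poetica'`, a bare `'\'`, and finally the path of a single `tokenizer_config.json` file. `from_pretrained` expects either a Hub repo id or a local folder containing the tokenizer files. A minimal sketch of the intended call, assuming the files live in a `models/` directory next to `main.py` (that layout is an assumption, not taken from the project):

```python
# Sketch: point from_pretrained at a directory (or a Hub repo id such as "gpt2"),
# never at a single JSON file or a bare "\" fragment. The models/ location is assumed.
from pathlib import Path
from transformers import AutoTokenizer

MODEL_DIR = Path(__file__).resolve().parent / "models"

tokenizer = AutoTokenizer.from_pretrained(str(MODEL_DIR), local_files_only=True)
```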
2024-11-16 23:07:40,051 - main - INFO - Loading tokenizer...
2024-11-16 23:07:40,135 - main - ERROR - Error initializing model: expected str, bytes or os.PathLike object, not NoneType
2024-11-16 23:07:40,136 - main - ERROR - Detailed traceback:
Traceback (most recent call last):
File "E:\Self Work\My Projects\Poetica HuggingFace Server\poetica\main.py", line 64, in initialize
self.tokenizer = AutoTokenizer.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\models\auto\tokenization_auto.py", line 896, in from_pretrained
return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\tokenization_utils_base.py", line 2291, in from_pretrained
return cls._from_pretrained(
^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\tokenization_utils_base.py", line 2329, in _from_pretrained
slow_tokenizer = (cls.slow_tokenizer_class)._from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\tokenization_utils_base.py", line 2525, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\models\gpt2\tokenization_gpt2.py", line 159, in __init__
with open(merges_file, encoding="utf-8") as merges_handle:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: expected str, bytes or os.PathLike object, not NoneType
2024-11-16 23:07:40,163 - main - ERROR - Failed to initialize model manager
2024-11-16 23:07:56,229 - main - INFO - Loading tokenizer...
2024-11-16 23:07:56,274 - main - ERROR - Error initializing model: expected str, bytes or os.PathLike object, not NoneType
2024-11-16 23:07:56,275 - main - ERROR - Detailed traceback:
Traceback (most recent call last):
File "E:\Self Work\My Projects\Poetica HuggingFace Server\poetica\main.py", line 64, in initialize
self.tokenizer = AutoTokenizer.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\models\auto\tokenization_auto.py", line 896, in from_pretrained
return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\tokenization_utils_base.py", line 2291, in from_pretrained
return cls._from_pretrained(
^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\tokenization_utils_base.py", line 2329, in _from_pretrained
slow_tokenizer = (cls.slow_tokenizer_class)._from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\tokenization_utils_base.py", line 2525, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\models\gpt2\tokenization_gpt2.py", line 159, in __init__
with open(merges_file, encoding="utf-8") as merges_handle:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: expected str, bytes or os.PathLike object, not NoneType
2024-11-16 23:07:56,280 - main - ERROR - Failed to initialize model manager
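The `TypeError` above is raised inside the slow GPT-2 tokenizer because `merges_file` is `None`: the local folder provides a `tokenizer_config.json` but no `vocab.json`/`merges.txt` (or `tokenizer.json`), so the loader has nothing to open. A one-time sketch that materializes a complete set of GPT-2 tokenizer files locally and then loads them offline; the `./models` target directory is an assumption:

```python
# Sketch: download the base GPT-2 tokenizer once and save every file it needs
# (vocab.json, merges.txt, tokenizer.json, ...) into the local models/ folder.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.save_pretrained("./models")

# Later, the offline load should succeed because the folder is now complete.
tokenizer = AutoTokenizer.from_pretrained("./models", local_files_only=True)
```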
2024-11-16 23:09:15,013 - main - INFO - Loading tokenizer...
2024-11-16 23:09:15,021 - main - ERROR - Error initializing model: expected `,` or `}` at line 2 column 18
2024-11-16 23:09:15,021 - main - ERROR - Detailed traceback:
Traceback (most recent call last):
File "E:\Self Work\My Projects\Poetica HuggingFace Server\poetica\main.py", line 80, in initialize
self.tokenizer = GPT2TokenizerFast(
^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\models\gpt2\tokenization_gpt2_fast.py", line 99, in __init__
super().__init__(
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\tokenization_utils_fast.py", line 115, in __init__
fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Exception: expected `,` or `}` at line 2 column 18
2024-11-16 23:09:15,022 - main - ERROR - Failed to initialize model manager
2024-11-16 23:09:32,778 - main - INFO - Loading tokenizer...
2024-11-16 23:09:32,778 - main - ERROR - Error initializing model: expected `,` or `}` at line 2 column 18
2024-11-16 23:09:32,779 - main - ERROR - Detailed traceback:
Traceback (most recent call last):
File "E:\Self Work\My Projects\Poetica HuggingFace Server\poetica\main.py", line 80, in initialize
self.tokenizer = GPT2TokenizerFast(
^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\models\gpt2\tokenization_gpt2_fast.py", line 99, in __init__
super().__init__(
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\tokenization_utils_fast.py", line 115, in __init__
fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Exception: expected `,` or `}` at line 2 column 18
2024-11-16 23:09:32,780 - main - ERROR - Failed to initialize model manager
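The `expected ',' or '}' at line 2 column 18` message comes from the Rust `tokenizers` backend while parsing `tokenizer.json`, so that file is malformed JSON (a stray or missing character near the reported position). A quick sketch to confirm and locate the defect with the standard library; the file path is an assumption, and the reported column may differ slightly from the backend's:

```python
# Sketch: validate tokenizer.json with the json module, which reports
# the line/column of the first syntax error it hits.
import json

with open("./models/tokenizer.json", encoding="utf-8") as f:
    try:
        json.load(f)
        print("tokenizer.json parses cleanly")
    except json.JSONDecodeError as err:
        print(f"Invalid JSON at line {err.lineno}, column {err.colno}: {err.msg}")
```

If the file is damaged, regenerating it via `tokenizer.save_pretrained(...)` (as in the previous sketch) is usually simpler than hand-editing it.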
2024-11-16 23:10:31,968 - main - INFO - Loading tokenizer...
2024-11-16 23:10:31,978 - urllib3.connectionpool - DEBUG - Starting new HTTPS connection (1): huggingface.co:443
2024-11-16 23:10:32,575 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/tokenizer_config.json HTTP/1.1" 200 0
2024-11-16 23:10:32,579 - filelock - DEBUG - Attempting to acquire lock 3194343144000 on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\be4d21d94f3b4687e5a54d84bf6ab46ed0f8defd.lock
2024-11-16 23:10:32,580 - filelock - DEBUG - Lock 3194343144000 acquired on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\be4d21d94f3b4687e5a54d84bf6ab46ed0f8defd.lock
2024-11-16 23:10:33,328 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "GET /gpt2/resolve/main/tokenizer_config.json HTTP/1.1" 200 26
2024-11-16 23:10:33,362 - filelock - DEBUG - Attempting to release lock 3194343144000 on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\be4d21d94f3b4687e5a54d84bf6ab46ed0f8defd.lock
2024-11-16 23:10:33,363 - filelock - DEBUG - Lock 3194343144000 released on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\be4d21d94f3b4687e5a54d84bf6ab46ed0f8defd.lock
2024-11-16 23:10:33,624 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/config.json HTTP/1.1" 200 0
2024-11-16 23:10:33,631 - filelock - DEBUG - Attempting to acquire lock 3194374134560 on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\10c66461e4c109db5a2196bff4bb59be30396ed8.lock
2024-11-16 23:10:33,632 - filelock - DEBUG - Lock 3194374134560 acquired on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\10c66461e4c109db5a2196bff4bb59be30396ed8.lock
2024-11-16 23:10:33,922 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "GET /gpt2/resolve/main/config.json HTTP/1.1" 200 665
2024-11-16 23:10:33,926 - filelock - DEBUG - Attempting to release lock 3194374134560 on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\10c66461e4c109db5a2196bff4bb59be30396ed8.lock
2024-11-16 23:10:33,926 - filelock - DEBUG - Lock 3194374134560 released on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\10c66461e4c109db5a2196bff4bb59be30396ed8.lock
2024-11-16 23:10:34,218 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/vocab.json HTTP/1.1" 200 0
2024-11-16 23:10:34,219 - filelock - DEBUG - Attempting to acquire lock 3194378990896 on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\1f1d9aaca301414e7f6c9396df506798ff4eb9a6.lock
2024-11-16 23:10:34,220 - filelock - DEBUG - Lock 3194378990896 acquired on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\1f1d9aaca301414e7f6c9396df506798ff4eb9a6.lock
2024-11-16 23:10:34,475 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "GET /gpt2/resolve/main/vocab.json HTTP/1.1" 200 1042301
2024-11-16 23:10:35,729 - filelock - DEBUG - Attempting to release lock 3194378990896 on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\1f1d9aaca301414e7f6c9396df506798ff4eb9a6.lock
2024-11-16 23:10:35,729 - filelock - DEBUG - Lock 3194378990896 released on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\1f1d9aaca301414e7f6c9396df506798ff4eb9a6.lock
2024-11-16 23:10:36,400 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/merges.txt HTTP/1.1" 200 0
2024-11-16 23:10:36,402 - filelock - DEBUG - Attempting to acquire lock 3194378990896 on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\226b0752cac7789c48f0cb3ec53eda48b7be36cc.lock
2024-11-16 23:10:36,403 - filelock - DEBUG - Lock 3194378990896 acquired on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\226b0752cac7789c48f0cb3ec53eda48b7be36cc.lock
2024-11-16 23:10:36,670 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "GET /gpt2/resolve/main/merges.txt HTTP/1.1" 200 456318
2024-11-16 23:10:36,918 - filelock - DEBUG - Attempting to release lock 3194378990896 on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\226b0752cac7789c48f0cb3ec53eda48b7be36cc.lock
2024-11-16 23:10:36,919 - filelock - DEBUG - Lock 3194378990896 released on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\226b0752cac7789c48f0cb3ec53eda48b7be36cc.lock
2024-11-16 23:10:37,180 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/tokenizer.json HTTP/1.1" 200 0
2024-11-16 23:10:37,183 - filelock - DEBUG - Attempting to acquire lock 3194378979184 on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\4b988bccc9dc5adacd403c00b4704976196548f8.lock
2024-11-16 23:10:37,184 - filelock - DEBUG - Lock 3194378979184 acquired on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\4b988bccc9dc5adacd403c00b4704976196548f8.lock
2024-11-16 23:10:37,879 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "GET /gpt2/resolve/main/tokenizer.json HTTP/1.1" 200 1355256
2024-11-16 23:10:39,156 - filelock - DEBUG - Attempting to release lock 3194378979184 on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\4b988bccc9dc5adacd403c00b4704976196548f8.lock
2024-11-16 23:10:39,157 - filelock - DEBUG - Lock 3194378979184 released on C:\Users\asus\.cache\huggingface\hub\.locks\models--gpt2\4b988bccc9dc5adacd403c00b4704976196548f8.lock
2024-11-16 23:10:39,435 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/added_tokens.json HTTP/1.1" 404 0
2024-11-16 23:10:39,694 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/special_tokens_map.json HTTP/1.1" 404 0
2024-11-16 23:10:39,884 - main - WARNING - Could not load custom vocabulary: property 'vocab' of 'GPT2TokenizerFast' object has no setter
2024-11-16 23:10:39,884 - main - INFO - Loading model...
2024-11-16 23:10:39,885 - main - ERROR - Error initializing model: Incorrect path_or_model_id: './models\poeticagpt-quantized-new.pth'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
2024-11-16 23:10:39,885 - main - ERROR - Detailed traceback:
Traceback (most recent call last):
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\utils\hub.py", line 402, in cached_file
resolved_file = hf_hub_download(
^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\huggingface_hub\utils\_validators.py", line 106, in _inner_fn
validate_repo_id(arg_value)
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\huggingface_hub\utils\_validators.py", line 160, in validate_repo_id
raise HFValidationError(
huggingface_hub.errors.HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name, max length is 96: './models\poeticagpt-quantized-new.pth'.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "E:\Self Work\My Projects\Poetica HuggingFace Server\poetica\main.py", line 66, in initialize
self.model = AutoModelForCausalLM.from_pretrained(model_path, local_files_only=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\models\auto\auto_factory.py", line 485, in from_pretrained
resolved_config_file = cached_file(
^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\utils\hub.py", line 466, in cached_file
raise EnvironmentError(
OSError: Incorrect path_or_model_id: './models\poeticagpt-quantized-new.pth'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
2024-11-16 23:10:39,916 - main - ERROR - Failed to initialize model manager
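Once the tokenizer falls back to downloading `gpt2` from the Hub, the run gets further, but the model load fails because a `.pth` checkpoint path is passed to `AutoModelForCausalLM.from_pretrained`, which only accepts a folder of Transformers weights (with a `config.json`) or a Hub repo id. If the checkpoint is a plain PyTorch state dict, one option is to build the architecture first and load the weights with `torch.load`. Everything below is an assumption rather than the project's actual setup: the `GPT2Config` values are placeholders, the file is assumed to be a state dict rather than a pickled model, and quantized weights may need the same quantization setup that was used when they were saved:

```python
# Sketch: load a raw .pth checkpoint into a GPT-2-style model instead of
# handing the file path to from_pretrained.
import torch
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config()                      # placeholder hyperparameters
model = GPT2LMHeadModel(config)

state_dict = torch.load("./models/poeticagpt-quantized-new.pth", map_location="cpu")
model.load_state_dict(state_dict, strict=False)  # strict=False tolerates key mismatches
model.eval()
```

Alternatively, saving the model once with `model.save_pretrained("./models")` produces a folder that `from_pretrained` can load directly.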
2024-11-16 23:11:46,212 - main - INFO - Loading tokenizer...
2024-11-16 23:11:46,216 - urllib3.connectionpool - DEBUG - Starting new HTTPS connection (1): huggingface.co:443
2024-11-16 23:11:47,226 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/tokenizer_config.json HTTP/1.1" 200 0
2024-11-16 23:11:47,382 - main - WARNING - Could not load custom vocabulary: property 'vocab' of 'GPT2TokenizerFast' object has no setter
2024-11-16 23:11:47,382 - main - INFO - Loading model...
2024-11-16 23:11:47,383 - main - ERROR - Error initializing model: Incorrect path_or_model_id: './models/poeticagpt-quantized-new.pth'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
2024-11-16 23:11:47,383 - main - ERROR - Detailed traceback:
Traceback (most recent call last):
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\utils\hub.py", line 402, in cached_file
resolved_file = hf_hub_download(
^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\huggingface_hub\utils\_validators.py", line 106, in _inner_fn
validate_repo_id(arg_value)
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\huggingface_hub\utils\_validators.py", line 154, in validate_repo_id
raise HFValidationError(
huggingface_hub.errors.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': './models/poeticagpt-quantized-new.pth'. Use `repo_type` argument if needed.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "E:\Self Work\My Projects\Poetica HuggingFace Server\poetica\main.py", line 66, in initialize
self.model = AutoModelForCausalLM.from_pretrained(model_path, local_files_only=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\models\auto\auto_factory.py", line 485, in from_pretrained
resolved_config_file = cached_file(
^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\utils\hub.py", line 466, in cached_file
raise EnvironmentError(
OSError: Incorrect path_or_model_id: './models/poeticagpt-quantized-new.pth'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
2024-11-16 23:11:47,386 - main - ERROR - Failed to initialize model manager
2024-11-16 23:12:20,475 - main - INFO - Loading tokenizer...
2024-11-16 23:12:20,478 - urllib3.connectionpool - DEBUG - Starting new HTTPS connection (1): huggingface.co:443
2024-11-16 23:12:21,010 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/tokenizer_config.json HTTP/1.1" 200 0
2024-11-16 23:12:21,169 - main - WARNING - Could not load custom vocabulary: property 'vocab' of 'GPT2TokenizerFast' object has no setter
2024-11-16 23:12:21,169 - main - INFO - Loading model...
2024-11-16 23:12:21,170 - main - ERROR - Error initializing model: Incorrect path_or_model_id: './models/poeticagpt.pth'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
2024-11-16 23:12:21,170 - main - ERROR - Detailed traceback:
Traceback (most recent call last):
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\utils\hub.py", line 402, in cached_file
resolved_file = hf_hub_download(
^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\huggingface_hub\utils\_validators.py", line 106, in _inner_fn
validate_repo_id(arg_value)
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\huggingface_hub\utils\_validators.py", line 154, in validate_repo_id
raise HFValidationError(
huggingface_hub.errors.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': './models/poeticagpt.pth'. Use `repo_type` argument if needed.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "E:\Self Work\My Projects\Poetica HuggingFace Server\poetica\main.py", line 66, in initialize
self.model = AutoModelForCausalLM.from_pretrained(model_path, local_files_only=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\models\auto\auto_factory.py", line 485, in from_pretrained
resolved_config_file = cached_file(
^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\utils\hub.py", line 466, in cached_file
raise EnvironmentError(
OSError: Incorrect path_or_model_id: './models/poeticagpt.pth'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
2024-11-16 23:12:21,173 - main - ERROR - Failed to initialize model manager
2024-11-16 23:13:01,527 - main - INFO - Loading tokenizer...
2024-11-16 23:13:01,531 - urllib3.connectionpool - DEBUG - Starting new HTTPS connection (1): huggingface.co:443
2024-11-16 23:13:02,097 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/tokenizer_config.json HTTP/1.1" 200 0
2024-11-16 23:13:02,216 - main - INFO - Loading model...
2024-11-16 23:13:02,217 - main - ERROR - Model file not found at poetica\models\poeticagpt.pth\poeticagpt.pth
2024-11-16 23:13:02,217 - main - ERROR - Failed to initialize model manager
2024-11-16 23:13:08,762 - main - INFO - Loading tokenizer...
2024-11-16 23:13:08,765 - urllib3.connectionpool - DEBUG - Starting new HTTPS connection (1): huggingface.co:443
2024-11-16 23:13:09,732 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/tokenizer_config.json HTTP/1.1" 200 0
2024-11-16 23:13:09,845 - main - INFO - Loading model...
2024-11-16 23:13:09,846 - main - ERROR - Model file not found at poetica\models\poeticagpt.pth
2024-11-16 23:13:09,846 - main - ERROR - Failed to initialize model manager
2024-11-16 23:13:33,649 - main - INFO - Loading tokenizer...
2024-11-16 23:13:33,652 - urllib3.connectionpool - DEBUG - Starting new HTTPS connection (1): huggingface.co:443
2024-11-16 23:13:34,215 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/tokenizer_config.json HTTP/1.1" 200 0
2024-11-16 23:13:34,334 - main - INFO - Loading model...
2024-11-16 23:13:34,335 - main - ERROR - Model file not found at .\poeticagpt.pth
2024-11-16 23:13:34,335 - main - ERROR - Failed to initialize model manager
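These `Model file not found` attempts show the checkpoint path being assembled differently on each try (`poetica\models\poeticagpt.pth\poeticagpt.pth`, `poetica\models\poeticagpt.pth`, `.\poeticagpt.pth`), which suggests the path depends on the current working directory and is at one point joined with the filename twice. A sketch that anchors the path to the script location and logs the absolute path it resolved; the filename and the `models/` layout are assumptions:

```python
# Sketch: resolve the checkpoint relative to main.py, not the working directory,
# and fail early with the fully resolved path in the log.
import logging
from pathlib import Path

logger = logging.getLogger("main")

MODEL_FILENAME = "poeticagpt.pth"                               # assumed name
MODEL_PATH = Path(__file__).resolve().parent / "models" / MODEL_FILENAME

if not MODEL_PATH.is_file():
    logger.error("Model file not found at %s", MODEL_PATH)
    raise FileNotFoundError(MODEL_PATH)
```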
2024-11-16 23:14:10,849 - main - INFO - Loading tokenizer...
2024-11-16 23:14:10,852 - urllib3.connectionpool - DEBUG - Starting new HTTPS connection (1): huggingface.co:443
2024-11-16 23:14:11,883 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/tokenizer_config.json HTTP/1.1" 200 0
2024-11-16 23:14:12,000 - main - INFO - Loading model...
2024-11-16 23:14:12,001 - main - ERROR - Error initializing model: Incorrect path_or_model_id: './poeticagpt.pth'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
2024-11-16 23:14:12,001 - main - ERROR - Detailed traceback:
Traceback (most recent call last):
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\utils\hub.py", line 402, in cached_file
resolved_file = hf_hub_download(
^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\huggingface_hub\utils\_validators.py", line 106, in _inner_fn
validate_repo_id(arg_value)
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\huggingface_hub\utils\_validators.py", line 160, in validate_repo_id
raise HFValidationError(
huggingface_hub.errors.HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name, max length is 96: './poeticagpt.pth'.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "E:\Self Work\My Projects\Poetica HuggingFace Server\poetica\main.py", line 66, in initialize
self.model = AutoModelForCausalLM.from_pretrained(model_path, local_files_only=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\models\auto\auto_factory.py", line 485, in from_pretrained
resolved_config_file = cached_file(
^^^^^^^^^^^^
File "e:\Self Work\My Projects\Poetica HuggingFace Server\.venv\Lib\site-packages\transformers\utils\hub.py", line 466, in cached_file
raise EnvironmentError(
OSError: Incorrect path_or_model_id: './poeticagpt.pth'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
2024-11-16 23:14:12,003 - main - ERROR - Failed to initialize model manager
2024-11-16 23:14:22,432 - main - INFO - Loading tokenizer...
2024-11-16 23:14:22,435 - urllib3.connectionpool - DEBUG - Starting new HTTPS connection (1): huggingface.co:443
2024-11-16 23:14:22,975 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "HEAD /gpt2/resolve/main/tokenizer_config.json HTTP/1.1" 200 0
2024-11-16 23:14:23,132 - main - WARNING - Could not load custom vocabulary: property 'vocab' of 'GPT2TokenizerFast' object has no setter
2024-11-16 23:14:23,132 - main - INFO - Loading model...
2024-11-16 23:14:23,132 - main - ERROR - Model file not found at ./models/poeticagpt.pth
2024-11-16 23:14:23,134 - main - ERROR - Failed to initialize model manager
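Separately, the recurring warning `property 'vocab' of 'GPT2TokenizerFast' object has no setter` indicates the code tries to assign a custom vocabulary directly to `tokenizer.vocab`, which the fast tokenizer does not support. If the goal is to extend GPT-2's vocabulary, the supported route is to add the tokens and then resize the model's embeddings; the token strings below are placeholders, not the project's real vocabulary:

```python
# Sketch: extend the tokenizer with custom tokens instead of assigning to .vocab,
# then keep the model's embedding matrix in sync with the new vocabulary size.
from transformers import AutoTokenizer, GPT2LMHeadModel

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.add_tokens(["<stanza>", "<verse>"])              # assumed custom tokens
tokenizer.add_special_tokens({"pad_token": "<|pad|>"})     # example special token

model = GPT2LMHeadModel.from_pretrained("gpt2")
model.resize_token_embeddings(len(tokenizer))
```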