Runtime error

None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.

Traceback (most recent call last):
  File "app.py", line 31, in <module>
    service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor, chunk_size_limit=512)
  File "/home/user/app/gpt_index/indices/service_context.py", line 71, in from_defaults
    embed_model = embed_model or OpenAIEmbedding()
  File "/home/user/app/gpt_index/embeddings/openai.py", line 209, in __init__
    super().__init__(**kwargs)
  File "/home/user/app/gpt_index/embeddings/base.py", line 55, in __init__
    self._tokenizer: Callable = globals_helper.tokenizer
  File "/home/user/app/gpt_index/utils.py", line 61, in tokenizer
    tokenizer = transformers.GPT2TokenizerFast.from_pretrained("gpt2")
  File "/home/user/.local/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1788, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for 'gpt2'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'gpt2' is the correct path to a directory containing all relevant files for a GPT2TokenizerFast tokenizer.
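For context, the call that fails (`transformers.GPT2TokenizerFast.from_pretrained("gpt2")` inside `gpt_index/utils.py`) can be reproduced outside the app with a minimal sketch like the one below. This assumes the `transformers` package is installed and that the container either has network access to huggingface.co or a locally cached copy of the `gpt2` tokenizer files; if neither holds, the snippet raises the same `OSError` as the Space.

```python
# Minimal reproduction of the tokenizer load that the traceback shows failing.
# Assumption: `transformers` is installed; loading "gpt2" needs either network
# access to the Hugging Face Hub or an already-cached copy of its files.
from transformers import GPT2TokenizerFast

try:
    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    print("Loaded gpt2 tokenizer:", tokenizer("hello world"))
except OSError as err:
    # Same OSError as in the container log: the files could not be fetched
    # and no local directory named "gpt2" was found.
    print("Could not load gpt2 tokenizer:", err)
```

If this snippet also fails inside the container, the tokenizer files are simply unreachable from the runtime environment, which matches the error above.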
