Transformers · PyTorch · English · trl · rlhf

OSError: trl-lib/llama-7b-se-rl-peft does not appear to have a file named config.json.

#9
by kmfoda - opened

Hello, thanks for contributing this amazing model. When I try to load it using:

from transformers import AutoModel
model = AutoModel.from_pretrained("trl-lib/llama-7b-se-rl-peft")

I get the following error:

OSError: trl-lib/llama-7b-se-rl-peft does not appear to have a file named config.json. Checkout 'https://huggingface.co/trl-lib/llama-7b-se-rl-peft/main' for available files.

Is there perhaps an intermediary step I need to carry out before loading the model?
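
From the repo name and the missing config.json, my guess is that this repo ships only PEFT adapter weights (adapter_config.json plus the adapter weights) rather than a full model, so AutoModel has no model config to read. Here is a minimal sketch of how I'd expect loading to work with the peft library; the base model ID below is an assumption on my part, the real one should be recorded under base_model_name_or_path in the repo's adapter_config.json:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Hypothetical base checkpoint; check adapter_config.json for the real one.
base_model_id = "huggyllama/llama-7b"
adapter_id = "trl-lib/llama-7b-se-rl-peft"

# Load the full base model first, then attach the adapter weights on top.
base_model = AutoModelForCausalLM.from_pretrained(base_model_id, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base_model, adapter_id)
tokenizer = AutoTokenizer.from_pretrained(base_model_id)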

The same happens when trying to deploy it as a Hugging Face Inference Endpoint:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/starlette/routing.py", line 705, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/opt/conda/lib/python3.9/site-packages/starlette/routing.py", line 584, in __aenter__
    await self._router.startup()
  File "/opt/conda/lib/python3.9/site-packages/starlette/routing.py", line 682, in startup
    await handler()
  File "/app/webservice_starlette.py", line 57, in some_startup_task
    inference_handler = get_inference_handler_either_custom_or_default_handler(HF_MODEL_DIR, task=HF_TASK)
  File "/app/huggingface_inference_toolkit/handler.py", line 45, in get_inference_handler_either_custom_or_default_handler
    return HuggingFaceHandler(model_dir=model_dir, task=task)
  File "/app/huggingface_inference_toolkit/handler.py", line 17, in __init__
    self.pipeline = get_pipeline(model_dir=model_dir, task=task)
  File "/app/huggingface_inference_toolkit/utils.py", line 261, in get_pipeline
    hf_pipeline = pipeline(task=task, model=model_dir, device=device, **kwargs)
  File "/opt/conda/lib/python3.9/site-packages/transformers/pipelines/__init__.py", line 705, in pipeline
    config = AutoConfig.from_pretrained(model, _from_pipeline=task, **hub_kwargs, **model_kwargs)
  File "/opt/conda/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 983, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/opt/conda/lib/python3.9/site-packages/transformers/configuration_utils.py", line 617, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/opt/conda/lib/python3.9/site-packages/transformers/configuration_utils.py", line 672, in _get_config_dict
    resolved_config_file = cached_file(
  File "/opt/conda/lib/python3.9/site-packages/transformers/utils/hub.py", line 388, in cached_file
    raise EnvironmentError(
OSError: /repository does not appear to have a file named config.json. Checkout 'https://huggingface.co//repository/None' for available files.
Application startup failed. Exiting.
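
If the adapter-only layout is indeed the cause, the Inference Endpoint fails for the same reason: the toolkit downloads the repo to /repository and calls pipeline() on it, which needs a config.json. One possible workaround, sketched here under the assumption that the adapter is LoRA-based and that the loading snippet above works, would be to merge the adapter into the base weights and push a self-contained model:

# Merge the LoRA adapter into the base weights so the resulting checkpoint
# ships a full config.json that pipeline() can load directly.
merged = model.merge_and_unload()  # `model` is the PeftModel from the snippet above
merged.save_pretrained("llama-7b-se-rl-merged")
tokenizer.save_pretrained("llama-7b-se-rl-merged")
# merged.push_to_hub("your-username/llama-7b-se-rl-merged")  # hypothetical repo id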
