Update config.json
Endpoint encountered an error.
You can try restarting it using the "retry" button above. Check
logs for more details.
[Server message] Endpoint failed to start
Exit code: 3. Reason: No custom pipeline found at /repository/handler.py
2024-10-29 04:23:34 - huggingface_inference_toolkit - INFO - Using device GPU
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 693, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 569, in __aenter__
    await self._router.startup()
  File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 670, in startup
    await handler()
  File "/app/webservice_starlette.py", line 62, in prepare_model_artifacts
    inference_handler = get_inference_handler_either_custom_or_default_handler(
  File "/app/huggingface_inference_toolkit/handler.py", line 96, in get_inference_handler_either_custom_or_default_handler
    return HuggingFaceHandler(model_dir=model_dir, task=task)
  File "/app/huggingface_inference_toolkit/handler.py", line 19, in __init__
    self.pipeline = get_pipeline(
  File "/app/huggingface_inference_toolkit/utils.py", line 261, in get_pipeline
    hf_pipeline = pipeline(task=task, model=model_dir, device=device, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/pipelines/__init__.py", line 805, in pipeline
    config = AutoConfig.from_pretrained(
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py", line 1000, in from_pretrained
    return config_class.from_dict(config_dict, **unused_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/configuration_utils.py", line 772, in from_dict
    config = cls(**config_dict)
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/llava/configuration_llava.py", line 104, in __init__
    vision_config = CONFIG_MAPPING[vision_config["model_type"]]
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py", line 695, in __getitem__
    raise KeyError(key)
KeyError: 'pixtral'
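For context, the final frame is a registry lookup: LLaVA-style configs resolve their `vision_config` by its `model_type`, and transformers builds that predate Pixtral have no `'pixtral'` entry in that registry, so the lookup raises `KeyError`. A minimal sketch of the failure mode (the registry contents below are illustrative stand-ins, not the real `CONFIG_MAPPING`):

```python
# Illustrative stand-in for transformers' CONFIG_MAPPING registry.
# In older transformers builds, "pixtral" was never registered, so
# the model_type lookup inside LlavaConfig.__init__ fails.
CONFIG_MAPPING = {
    "clip_vision_model": "CLIPVisionConfig",  # example entry only
}

# What Pixtral's config.json declares for its vision tower.
vision_config = {"model_type": "pixtral"}

try:
    config_class = CONFIG_MAPPING[vision_config["model_type"]]
except KeyError as err:
    print(f"KeyError: {err}")  # KeyError: 'pixtral'
```

The fix is therefore not in the model repository itself but in the transformers version inside the endpoint's container, which needs Pixtral registered.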
Thank you for your feedback.
I've updated the configuration to match the original model. If you don’t mind, could you please give it another try?
Legend
Sorry, it looks like the same problem.
BTW, I am using Inference Endpoints.
If there is a setup step I'm missing or a recommended way to run this, please let me know. Thank you.
I am very new to AI, so please bear with me.
Exit code: 3. Reason: No custom pipeline found at /repository/handler.py
2024-10-29 05:49:52 - huggingface_inference_toolkit - INFO - Using device GPU
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 693, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 569, in __aenter__
    await self._router.startup()
  File "/usr/local/lib/python3.10/dist-packages/starlette/routing.py", line 670, in startup
    await handler()
  File "/app/webservice_starlette.py", line 62, in prepare_model_artifacts
    inference_handler = get_inference_handler_either_custom_or_default_handler(
  File "/app/huggingface_inference_toolkit/handler.py", line 96, in get_inference_handler_either_custom_or_default_handler
    return HuggingFaceHandler(model_dir=model_dir, task=task)
  File "/app/huggingface_inference_toolkit/handler.py", line 19, in __init__
    self.pipeline = get_pipeline(
  File "/app/huggingface_inference_toolkit/utils.py", line 261, in get_pipeline
    hf_pipeline = pipeline(task=task, model=model_dir, device=device, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/pipelines/__init__.py", line 805, in pipeline
    config = AutoConfig.from_pretrained(
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py", line 1000, in from_pretrained
    return config_class.from_dict(config_dict, **unused_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/configuration_utils.py", line 772, in from_dict
    config = cls(**config_dict)
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/llava/configuration_llava.py", line 104, in __init__
    vision_config = CONFIG_MAPPING[vision_config["model_type"]]
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py", line 695, in __getitem__
    raise KeyError(key)
KeyError: 'pixtral'
Thank you so much for taking the time to review and provide this valuable feedback.
I typically test my models locally, so I'm less familiar with the Hugging Face Inference Endpoints setup. It's possible there's a configuration issue related to how the model is set up on their platform. Could you please check your transformers version?
Pixtral requires transformers version 4.45.1 or above. For more details, you can refer to the original repository.
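To make the requirement concrete, here is a sketch of the version comparison (the installed version below is the one reported in this thread; `packaging.version` would be the more robust choice, but a plain tuple compare works for simple x.y.z versions):

```python
def version_tuple(version: str) -> tuple:
    """Parse a simple 'x.y.z' version string into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

REQUIRED = "4.45.1"   # minimum transformers version with Pixtral support
installed = "4.38.2"  # version reported later in this thread

if version_tuple(installed) < version_tuple(REQUIRED):
    print(f"transformers {installed} is too old; upgrade to >= {REQUIRED}")
```

As a possible fix on Inference Endpoints, a `requirements.txt` at the repository root with a line like `transformers>=4.45.1` can pull a newer version into the container; whether that applies depends on the endpoint's container setup.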
Yeah, I have checked. The transformers version is 4.38.2. Thank you for your help.
Glad to hear :)
Go for it, Youngwoo hyung! (Korean cheer of encouragement)