500 internal error when running inference from model
#15
by Diaa-Essam - opened
This error keeps showing up when using the Hugging Face Inference API to run inference from the model:
500 Server Error: Internal Server Error for url: https://api-inference.huggingface.co/models/lmsys/fastchat-t5-3b-v1.0
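
For reference, a minimal sketch of the kind of request that produces this error (the payload, token, and helper function are placeholders for illustration, not my exact code):

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/lmsys/fastchat-t5-3b-v1.0"
# Placeholder token; replace with your own Hugging Face access token.
headers = {"Authorization": "Bearer hf_xxx"}

def query(payload):
    # POST the payload to the hosted Inference API endpoint.
    response = requests.post(API_URL, headers=headers, json=payload)
    # raise_for_status() surfaces the "500 Server Error: Internal Server Error" shown above.
    response.raise_for_status()
    return response.json()

print(query({"inputs": "Hello, how are you?"}))
```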