For now, use this version of Transformers for vLLM.
#29 opened by mkvn
Hello, this website could not be found. Could you please provide a new one? Thank you!
This was not a website; it's the transformers package that supports this model on vLLM.
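Since the original link is dead, the general workaround is to install a development build of transformers from source before serving the model with vLLM. A minimal sketch is below; the exact branch or commit that supports this model is not given in this thread, so `<ref>` and `<model-id>` are placeholders you would need to fill in from the model card.

```shell
# Install a development build of transformers from source.
# Replace <ref> with the branch/commit the model card recommends
# (it is not specified in this thread).
pip install "git+https://github.com/huggingface/transformers.git@<ref>"

# Then serve the model with vLLM as usual, e.g.:
# vllm serve <model-id>
```

Pinning the transformers ref this way avoids creating a separate environment just for one model, at the cost of possibly downgrading/upgrading transformers for the rest of the environment.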
Hi! Is this a version that doesn't push the model to answer 11111111111111111111111111111111(1) to almost every question?
Has anyone figured out what happened? I'd prefer not to set up a separate venv with a separate vLLM just for single-model inference...