FuryMartin committed d5af37e
1 Parent(s): f453290
Update README.md

README.md CHANGED

This is a transformers-compatible `llava-critic-7b` model converted from [lmms-lab/llava-critic-7b](https://huggingface.co/lmms-lab/llava-critic-7b/tree/main) using [convert_llava_onevision_weights_to_hf.py](https://github.com/huggingface/transformers/blob/main/src/transformers/models/llava_onevision/convert_llava_onevision_weights_to_hf.py).
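
Since the checkpoint is in the Hugging Face LLaVA-OneVision format, it should load with the stock `transformers` classes. Below is a minimal sketch; the `model_id`, example image URL, and prompt are placeholder assumptions, not values taken from this repository:

```python
import torch
import requests
from PIL import Image
from transformers import AutoProcessor, LlavaOnevisionForConditionalGeneration

model_id = "FuryMartin/llava-critic-7b-hf"  # placeholder: use the actual path of this conversion

processor = AutoProcessor.from_pretrained(model_id)
model = LlavaOnevisionForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Build a single-image, critic-style prompt via the processor's chat template.
conversation = [
    {
        "role": "user",
        "content": [
            {"type": "image"},
            {"type": "text", "text": "Does the caption 'a lake with a wooden pier' fit this image? Explain briefly."},
        ],
    },
]
prompt = processor.apply_chat_template(conversation, add_generation_prompt=True)

image = Image.open(requests.get("https://llava-vl.github.io/static/images/view.jpg", stream=True).raw)

inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device, torch.float16)
output = model.generate(**inputs, max_new_tokens=128)
print(processor.decode(output[0], skip_special_tokens=True))
```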

However, there may be some **precision problems** to be fixed. **See [#34467](https://github.com/huggingface/transformers/issues/34467) for details.**

## Requirements for vLLM

The latest vLLM (0.6.3.post1) has a severe bug when serving models; see [vllm #9848](https://github.com/vllm-project/vllm/issues/9848) for details.

I recommend using `vllm==0.6.2` to avoid this issue.
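
For offline inference with the pinned version, here is a rough sketch using vLLM's multimodal `LLM.generate` API; the model path, local image file, and Qwen2-style prompt template are assumptions modeled on vLLM's LLaVA-OneVision examples, not taken from this repository:

```python
# Assumes: pip install vllm==0.6.2  (0.6.3.post1 is affected by the serving bug above)
from PIL import Image
from vllm import LLM, SamplingParams

model_id = "FuryMartin/llava-critic-7b-hf"  # placeholder: point at the converted checkpoint

llm = LLM(model=model_id, max_model_len=8192)
sampling_params = SamplingParams(temperature=0.0, max_tokens=128)

# Qwen2-style chat prompt with an <image> placeholder, following vLLM's
# LLaVA-OneVision offline-inference examples.
prompt = (
    "<|im_start|>user <image>\n"
    "Does the caption 'a dog on a beach' accurately describe this image?<|im_end|>"
    "<|im_start|>assistant\n"
)
image = Image.open("example.jpg")  # any local image

outputs = llm.generate(
    {"prompt": prompt, "multi_modal_data": {"image": image}},
    sampling_params=sampling_params,
)
print(outputs[0].outputs[0].text)
```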