Update README.md
README.md
@@ -54,13 +54,13 @@ Resources:
 
 ```bash
 # mmproj
-wget https://huggingface.co/xtuner/llava-llama-3-8b-v1_1-gguf/resolve/main/mmproj-
+wget https://huggingface.co/xtuner/llava-llama-3-8b-v1_1-gguf/resolve/main/llava-llama-3-8b-v1_1-mmproj-f16.gguf
 
 # fp16 llm
-wget https://huggingface.co/xtuner/llava-llama-3-8b-v1_1-gguf/resolve/main/
+wget https://huggingface.co/xtuner/llava-llama-3-8b-v1_1-gguf/resolve/main/llava-llama-3-8b-v1_1-f16.gguf
 
 # int4 llm
-wget https://huggingface.co/xtuner/llava-llama-3-8b-v1_1-gguf/resolve/main/
+wget https://huggingface.co/xtuner/llava-llama-3-8b-v1_1-gguf/resolve/main/llava-llama-3-8b-v1_1-int4.gguf
 ```
 
 ### Build environment
@@ -70,12 +70,14 @@ wget https://huggingface.co/xtuner/llava-llama-3-8b-v1_1-gguf/resolve/main/ggml-
 
 ### Chat by `./llava-cli`
 
+Note: llava-llama-3-8b-v1_1 uses the Llama-3-instruct chat template.
+
 ```bash
 # fp16
-./llava-cli -m ./
+./llava-cli -m ./llava-llama-3-8b-v1_1-f16.gguf --mmproj ./llava-llama-3-8b-v1_1-mmproj-f16.gguf --image YOUR_IMAGE.jpg -c 4096
 
 # int4
-./llava-cli -m ./
+./llava-cli -m ./llava-llama-3-8b-v1_1-int4.gguf --mmproj ./llava-llama-3-8b-v1_1-mmproj-f16.gguf --image YOUR_IMAGE.jpg -c 4096
 ```
 
 ### Reproduce
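The note added in the `### Chat by ./llava-cli` hunk says the model expects the Llama-3-instruct chat template. As a minimal sketch (not part of this diff, assuming llama.cpp's standard `-p`/`--prompt` and `-e`/`--escape` options; the image name and question are placeholders), the template can also be passed explicitly:

```bash
# Sketch only: int4 model with an explicit Llama-3-instruct prompt.
# <image> marks where llava-cli injects the image embedding; -e expands the \n escapes.
./llava-cli -m ./llava-llama-3-8b-v1_1-int4.gguf \
  --mmproj ./llava-llama-3-8b-v1_1-mmproj-f16.gguf \
  --image YOUR_IMAGE.jpg -c 4096 -e \
  -p "<|start_header_id|>user<|end_header_id|>\n\n<image>\nDescribe this image.<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
```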