TheBloke committed
Commit cc63bcf
1 Parent(s): 5fa6d7a

Upload README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED

````diff
@@ -94,7 +94,7 @@ Documentation on installing and using vLLM [can be found here](https://vllm.read
 - When using vLLM as a server, pass the `--quantization awq` parameter, for example:
 
 ```shell
-python3 python -m vllm.entrypoints.api_server --model TheBloke/Llama-2-7B-GPTQ --quantization awq
+python3 python -m vllm.entrypoints.api_server --model TheBloke/Llama-2-7B-AWQ --quantization awq
 ```
 
 When using vLLM from Python code, pass the `quantization=awq` parameter, for example:
````
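The trailing context line of the hunk ("When using vLLM from Python code, pass the `quantization=awq` parameter, for example:") is cut off at the hunk boundary before the example it introduces. A minimal sketch of what such a call looks like with vLLM's offline `LLM` API, using the model name from the diff above; the prompt and sampling settings are illustrative, not from the original README:

```python
from vllm import LLM, SamplingParams

# Load the AWQ-quantised model; quantization="awq" mirrors the
# server-side --quantization awq flag shown in the diff.
llm = LLM(model="TheBloke/Llama-2-7B-AWQ", quantization="awq")

prompts = ["Tell me about AI"]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)  # illustrative values

outputs = llm.generate(prompts, sampling_params)
for output in outputs:
    print(output.outputs[0].text)
```

Running this requires the `vllm` package and a CUDA-capable GPU; the quantized weights are downloaded from the Hugging Face Hub on first use.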