Commit 7f556f7 · Update README.md
Parent: 86b823a

README.md CHANGED
@@ -174,7 +174,7 @@ You may then connect to the OpenAI-compatible API endpoint with tools such as [B
 ## Serving with Oobabooga / text-generation-webui
 
 The model may also be loaded via [oobabooga/text-generation-webui](https://github.com/oobabooga/text-generation-webui/) in a similar manner to other models.
-See the requirements below. Note that inference with Transformers is significantly slower than using the recommended OpenChat vLLM server.
+See the requirements below. Note that inference with just the Transformers library is significantly slower than using the recommended OpenChat vLLM server.
 
 ### Oobabooga Key Requirements
 
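As context for the hunk above: the OpenAI-compatible API endpoint it refers to accepts standard `/v1/chat/completions` requests. The sketch below builds such a request payload; the host, port `18888`, and the model name `openchat_3.5` are assumptions based on typical OpenChat vLLM server deployments, not details stated in this diff.

```python
import json

# Hypothetical server address -- the real host/port depend on how the
# OpenChat vLLM server was launched (assumption, not stated in this diff).
API_URL = "http://localhost:18888/v1/chat/completions"


def build_chat_request(prompt: str, model: str = "openchat_3.5") -> dict:
    """Build an OpenAI-compatible chat-completions payload."""
    return {
        "model": model,  # assumed model name
        "messages": [{"role": "user", "content": prompt}],
    }


payload = build_chat_request("Hello!")
body = json.dumps(payload)
# One would then POST `body` to API_URL, e.g. with the requests library:
#   requests.post(API_URL, data=body,
#                 headers={"Content-Type": "application/json"})
```

Any client that speaks the OpenAI chat-completions wire format (including the official `openai` Python package pointed at a custom base URL) can send the same payload.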
@@ -224,7 +224,6 @@ It should look as below:
 
 Then you should be ready to generate!
 
-
 # Citation
 
 ```bibtex