Update README.md

README.md CHANGED

@@ -4,6 +4,17 @@ inference: false
 
 # longchat-7b-16k Model Card
 
+Please use load_model from FastChat or LongChat repo to load the model (or chatting API from FastChat). There is a monkey patch needed to use the model.
+Usage reference:
+
+(LongChat) python3 eval.py --model-name-or-path lmsys/longchat-7b-16k --task topics
+
+(FastChat) python3 -m fastchat.serve.cli --model-path lmsys/longchat-7b-16k
+
+Under the hood, the monkey patch is added in:
+
+https://github.com/lm-sys/FastChat/blob/da0641e567cf93756b0978ab5a6b092e96f06240/fastchat/model/model_adapter.py#L429
+
 ## Model details
 
 **Model type:**
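The monkey patch referenced in the diff swaps in a "condensed" rotary position-embedding forward before the model is instantiated. As a minimal, generic sketch of the monkey-patching technique itself (the class name and the ratio of 8 below are illustrative stand-ins, not FastChat's actual code):

```python
# Generic sketch of monkey patching: rebind a method on a class at
# runtime, before any instances are created. FastChat's patch for
# longchat-7b-16k replaces the LLaMA rotary-embedding forward in the
# same spirit (names and the ratio here are illustrative only).

class RotaryEmbedding:
    """Stand-in for the original embedding class."""

    def forward(self, position: int) -> float:
        # Original behavior: use the position index as-is.
        return float(position)

def condensed_forward(self, position: int, ratio: int = 8) -> float:
    # Patched behavior: divide positions by a condensing ratio so a
    # 16k-token context maps into the 2k range the base model saw.
    return float(position) / ratio

# The monkey patch: rebind the method on the class itself, so every
# instance created afterwards uses the condensed variant.
RotaryEmbedding.forward = condensed_forward

emb = RotaryEmbedding()
print(emb.forward(16000))  # 2000.0
```

The key point is that the patch must be applied before the model is built, which is why loading through FastChat's or LongChat's own entry points (rather than plain `transformers`) is required.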