Vicuna-v1.3 not supported?

#4
by acupofespresso - opened

2023-06-24 18:08:57 | ERROR | stderr |
2023-06-24 18:08:57 | ERROR | stderr | /usr/local/lib/python3.10/dist-packages/fastchat/model/model_adapter.py:321: UserWarning:
2023-06-24 18:08:57 | ERROR | stderr | You are probably using the old Vicuna-v0 model, which will generate unexpected results with the current fastchat.
2023-06-24 18:08:57 | ERROR | stderr | You can try one of the following methods:
2023-06-24 18:08:57 | ERROR | stderr | 1. Upgrade your weights to the new Vicuna-v1.3: https://github.com/lm-sys/FastChat#vicuna-weights.
2023-06-24 18:08:57 | ERROR | stderr | 2. Use the old conversation template by python3 -m fastchat.serve.cli --model-path /path/to/vicuna-v0 --conv-template conv_one_shot
2023-06-24 18:08:57 | ERROR | stderr | 3. Downgrade fschat to fschat==0.1.10 (Not recommonded).

Baichuan's vocabulary is 64k, larger than LLaMA's 32k, which triggers FastChat's warning: FastChat assumes that any LLaMA-architecture model with a vocabulary larger than 32,000 tokens is the old Vicuna-v0 weights. The warning is a false positive here and does not affect usage.
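For context, the check that fires here lives in fastchat/model/model_adapter.py (the file cited in the traceback above). Below is a minimal sketch of that heuristic, paraphrased rather than copied, so the exact code may differ between fschat versions:

```python
# A minimal sketch of the heuristic behind this warning, paraphrased from
# fastchat/model/model_adapter.py (exact code varies across fschat versions).
import warnings

from transformers import LlamaForCausalLM


def raise_warning_for_old_weights(model) -> None:
    # Vicuna-v0 weights added an extra pad token on top of LLaMA's 32,000-entry
    # vocabulary, so FastChat flags any LLaMA-architecture model whose
    # vocabulary exceeds 32,000. Baichuan's 64k vocabulary also satisfies this
    # check, producing a false positive.
    if isinstance(model, LlamaForCausalLM) and model.model.vocab_size > 32000:
        warnings.warn(
            "You are probably using the old Vicuna-v0 model, which will "
            "generate unexpected results with the current fastchat."
        )
```

Because the check is only a heuristic on vocabulary size, it cannot distinguish Baichuan's larger vocabulary from the old v0 weights, which is why the warning can be safely ignored for this model.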

fireballoon changed discussion status to closed
