StarCycle committed
Commit
2e4a5ca
1 Parent(s): 2791276

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -241,7 +241,7 @@ And the learning rate curve:
 ```
 xtuner convert pth_to_hf ./finetune.py ./work_dirs/iter_xxx.pth ./my_lora_and_projector
 ```
-The adapter still needs to be used with the internlm/internlm2-chat-7b and the vision encoder. I have not tried to merge them yet, but it is possible with XTuner; see this [tutorial](https://github.com/InternLM/xtuner/blob/f63859b3d0cb39cbac709e3850f3fe01de1023aa/xtuner/configs/llava/README.md#L4).
+The adapter still needs to be used with the internlm/internlm2-chat-1_8b and the vision encoder. I have not tried to merge them yet, but it is possible with XTuner; see this [tutorial](https://github.com/InternLM/xtuner/blob/f63859b3d0cb39cbac709e3850f3fe01de1023aa/xtuner/configs/llava/README.md#L4).
 
 ## MMBench Evaluation
 You can first download the MMBench data:
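
For reference, a minimal sketch of how the exported adapter could be loaded on top of internlm/internlm2-chat-1_8b with Hugging Face PEFT. The `./my_lora_and_projector` path comes from the `xtuner convert pth_to_hf` command shown in the diff above; the loading code itself, and the assumption that the LoRA weights load directly from that directory, are illustrative and not part of this commit.

```
# Sketch (assumption, not shipped with this commit): load the exported LoRA
# adapter on top of the internlm2-chat-1_8b base model with Hugging Face PEFT.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "internlm/internlm2-chat-1_8b", trust_remote_code=True
)
tokenizer = AutoTokenizer.from_pretrained(
    "internlm/internlm2-chat-1_8b", trust_remote_code=True
)

# "./my_lora_and_projector" is the output directory of `xtuner convert pth_to_hf`;
# depending on the config, the LLM adapter may sit in a subfolder of it.
model = PeftModel.from_pretrained(base, "./my_lora_and_projector")

# Optionally fold the LoRA weights into the base model for standalone text-only use;
# the vision encoder and projector still have to be wired in separately.
model = model.merge_and_unload()
```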