Commit 68b24d7 by grimulkan (parent 2a85638): Update README.md
README.md (license: cc-by-nc-2.0)

This is a merge of [LongAlpaca-70B-lora](https://huggingface.co/Yukang/LongAlpaca-70B-lora) into lizpreciatior's [lzlv_70b_fp16_hf](https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf), with the extra embedding row and pad token removed so that the vocabularies match.

There is no additional fine-tuning. The resulting model does not appear to be broken; you can test whether it truly behaves as the original model plus 32K context capability (use linear RoPE scaling with a factor of 8).
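
Linear RoPE scaling with factor 8 simply divides each position index by 8 before computing the rotary embedding angles, so the 0–32K extended range maps onto the 0–4K range the base model was trained on. A minimal illustrative sketch of that idea (the function name is hypothetical, and `dim`/`base` are typical Llama-2 values, not read from this model's config):

```python
import math

def rope_angles(position, dim=128, base=10000.0, scaling_factor=8.0):
    """Rotary embedding angles for one position with linear scaling.

    Linear scaling divides the position index by `scaling_factor`,
    so positions 0..32767 land in the 0..4095 range the base model
    was trained on. `dim` and `base` are assumed typical values.
    """
    scaled_pos = position / scaling_factor
    # One inverse frequency per pair of hidden dimensions.
    inv_freq = [base ** (-2.0 * i / dim) for i in range(dim // 2)]
    return [scaled_pos * f for f in inv_freq]

# Position 32760 with factor 8 sees the same angles as
# position 4095 with no scaling:
assert rope_angles(32760, scaling_factor=8.0) == rope_angles(4095, scaling_factor=1.0)
```

This is only the positional-indexing trick; the full rotary embedding then applies sin/cos of these angles to the query and key vectors.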
 
[ChuckMcSneed](https://huggingface.co/ChuckMcSneed) ran a benchmark [here](https://huggingface.co/grimulkan/lzlv-longLORA-70b-rope8-32k-fp16/discussions/2), indicating roughly 30% degradation at 8x the context length.

You could also try merging this with other LongLoRA-derived models (such as [Aurelian](https://huggingface.co/grimulkan/aurelian-v0.5-70b-rope8-32K-fp16)).
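
One way to attempt such a merge is with a tool like mergekit. A hypothetical, untested config sketch for an equal-weight linear merge (the method and weights are placeholders, not a recommendation from this model card):

```yaml
# Hypothetical mergekit config: equal-weight linear merge of two
# LongLoRA-derived 70B models. Weights and merge_method are
# illustrative placeholders only.
models:
  - model: grimulkan/lzlv-longLORA-70b-rope8-32k-fp16
    parameters:
      weight: 0.5
  - model: grimulkan/aurelian-v0.5-70b-rope8-32K-fp16
    parameters:
      weight: 0.5
merge_method: linear
dtype: float16
```

Since both models share the same RoPE-scaled base, their weights are at least nominally compatible, but merge quality would need to be evaluated empirically.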

A 6-bit EXL2 quantization is available [here](https://huggingface.co/grimulkan/lzlv-longLORA-70b-rope8-32k-6bpw-h8-exl2).
 