Guanzheng committed on
Commit 69a93c1
1 Parent(s): 8721983

Update README.md

Files changed (1):
  README.md +7 -6
README.md CHANGED
@@ -1,5 +1,7 @@
  ---
  license: mit
+ datasets:
+ - DAMO-NLP-SG/LongCorpus-2.5B
  ---

  # CLEX: Continuous Length Extrapolation for Large Language Models
@@ -22,10 +24,10 @@ If you have any questions, feel free to contact us. (Emails: guanzzh.chen@gmail.
  |:-----|:-----|:-----------|:-----------|:-----------|:-----------|:------:|
  | CLEX-LLaMA-2-7B-16K | base | LLaMA-2-7B | [Redpajama-Book](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T) | 16K | 64K | [link](https://huggingface.co/DAMO-NLP-SG/CLEX-7B-16K) |
  | CLEX-LLaMA-2-7B-Chat-16K | chat | CLEX-7B-16K | [UltraChat](https://github.com/thunlp/UltraChat) | 16K | 64K | [link](https://huggingface.co/DAMO-NLP-SG/CLEX-7B-Chat-16K) |
- | CLEX-LLaMA-2-7B-64K | base | LLaMA-2-7B | [Redpajama-Book](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T) | 64k | 256K | Pending Upload |
- | CLEX-Phi-2-7B-32K | base | Phi-2-2.7B | [LongCorpus-2.5B](https://huggingface.co/datasets/DAMO-NLP-SG/LongCorpus-2.5B) | 32k | 128K | Pending Upload |
- | CLEX-Mixtral-8x7B-32K | base | Mixtral-8x7B-v0.1 | [LongCorpus-2.5B](https://huggingface.co/datasets/DAMO-NLP-SG/LongCorpus-2.5B) | 32k | >128K | Pending Upload |
- | CLEX-Mixtral-8x7B-Chat-32k | chat | CLEX-Mixtral-8x7B-32K | [Ultrachat 200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k) | 32k | >128K | Pending Upload |
+ | CLEX-LLaMA-2-7B-64K | base | LLaMA-2-7B | [Redpajama-Book](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T) | 64k | 256K | [link](https://huggingface.co/DAMO-NLP-SG/CLEX-LLaMA-2-7B-64K) |
+ | CLEX-Phi-2-7B-32K | base | Phi-2-2.7B | [LongCorpus-2.5B](https://huggingface.co/datasets/DAMO-NLP-SG/LongCorpus-2.5B) | 32k | 128K | [link](https://huggingface.co/DAMO-NLP-SG/CLEX-Phi-2-2.7B-32K) |
+ | CLEX-Mixtral-8x7B-32K | base | Mixtral-8x7B-v0.1 | [LongCorpus-2.5B](https://huggingface.co/datasets/DAMO-NLP-SG/LongCorpus-2.5B) | 32k | >128K | [link](https://huggingface.co/DAMO-NLP-SG/CLEX-Mixtral-8x7B-32K) |
+ | CLEX-Mixtral-8x7B-Chat-32k | chat | CLEX-Mixtral-8x7B-32K | [Ultrachat 200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k) | 32k | >128K | [link](https://huggingface.co/DAMO-NLP-SG/CLEX-Mixtral-8x7B-Chat-32K) |
  </div>


@@ -78,5 +80,4 @@ If you find our project useful, hope you can star our repo and cite our paper as
  journal = {arXiv preprint arXiv:2310.16450},
  url = {https://arxiv.org/abs/2310.16450}
  }
- ```
-
+ ```
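
For reference, a minimal sketch of loading one of the checkpoints linked in the updated table is below; it is not part of the commit. It assumes the Hub repos ship CLEX's custom modeling code, so `trust_remote_code=True` would be required; the repo id is taken from the table, while the dtype, device placement, and prompt are illustrative.

```python
# Minimal sketch (assumptions noted inline), not the project's official usage.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "DAMO-NLP-SG/CLEX-7B-16K"  # repo id copied from the table above

# trust_remote_code=True is assumed to be needed, since CLEX appears to use
# custom modeling code hosted alongside the weights.
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # bf16 keeps the 7B model within typical GPU memory
    device_map="auto",           # requires `accelerate`; places weights automatically
    trust_remote_code=True,
)

prompt = "Continuous length extrapolation lets a model trained on 16K tokens"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```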