bartowski committed on
Commit 2eecdd3 • 1 Parent(s): 241ab5c

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +3 -11
README.md CHANGED

@@ -1,12 +1,6 @@
  ---
- base_model: Qwen/Qwen2.5-7B-Instruct
- language:
- - en
- license: apache-2.0
- pipeline_tag: text-generation
- tags:
- - chat
  quantized_by: bartowski
+ pipeline_tag: text-generation
  ---
  ## 💫 Community Model> Qwen2.5 7B Instruct by Qwen

@@ -18,16 +12,14 @@ quantized_by: bartowski

  ## Technical Details

- Long context: Support for 128k tokens and 8k token generation
+ Long context: Support for 128k tokens with yarn rope settings and 8k token generation

- Large-scale training dataset: Encompasses a huge range of knowledge.
+ Large-scale training dataset: 18T tokens for the training encompasses a huge range of knowledge.

  Enhanced structured data understanding and generation.

  Over 29 languages including Chinese, English, French, Spanish, Portuguese, German, Italian, Russian, Japanese, Korean, Vietnamese, Thai, and Arabic.

- More details available [here](https://qwenlm.github.io/blog/qwen2.5/)
-
  ## Special thanks

  🙏 Special thanks to [Georgi Gerganov](https://github.com/ggerganov) and the whole team working on [llama.cpp](https://github.com/ggerganov/llama.cpp/) for making all of this possible.