---
base_model:
- CohereForAI/c4ai-command-r-plus
library_name: transformers
tags:
- mergekit
- merge
language:
- en
- fr
- de
- es
- it
- pt
- ja
- ko
- zh
- ar
pipeline_tag: text-generation
license: cc-by-nc-4.0
---
# Megac4ai-command-r-plus

🚨 **This model was created using a custom mergekit build that supports c4ai-command-r-plus.**

This is a 160B frankenmerge model created by interleaving layers of [CohereForAI/c4ai-command-r-plus](https://huggingface.co/CohereForAI/c4ai-command-r-plus) with itself using mergekit.

## Output comparison
### Test Case Details

Condition: temperature=0.3

```
<|START_OF_TURN_TOKEN|><|USER_TOKEN|>ティム: やあ、調子はどう？
キム: いろいろやろうとしてたんだけど、また先延ばしにしちゃったよ。
ティム: 何をしようとしていたの？
キム: 大学の課題だよ。どうにもやる気が出なくてね。
ティム: 集中できないなら、ポモドーロ・テクニックをするといいよ。
キム: 何それ？
ティム: 25分作業して、5分休憩するのを繰り返すんだよ。一回あたりの作業時間が短くて集中できるよ。
キム: うーん、集中っていうわけじゃないんだよね
ティム: じゃあ1日に5分だけでいいから机で課題をするっていうのはどう？
キム: 5分じゃ何もできなくない？
ティム: 短い時間でもいいから机で作業するっていうのがポイントなんだよ。むしろもっとやりたい、くらいで止めておくと毎日続くもっと長い時間できるようになるよ。
キム: 確かにそれならできるかも。ありがとう！
Q: キムは何をやってみようとしていますか？また何故それをやろうとしていますか？<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>
```

This task is included in [elyza/ELYZA-tasks-100](https://huggingface.co/datasets/elyza/ELYZA-tasks-100).

### Output Example

| Model | Output |
|-------|--------|
| CohereForAI/c4ai-command-r-plus | キムは大学の課題をやろうとしています。しかし、やる気が起きず、先延ばしにしてしまったようです。 |
| nitky/Megac4ai-command-r-plus | キムは大学の課題をやろうとしています。やる気が出ず、集中して作業することができないため、ティムにアドバイスを求めています。ティムが提案したポモドーロ・テクニックや、1日に5分だけ机で課題をするという方法を試すことで、課題に取り組む習慣を身につけようとしています。 |

## Test environment

This model was tested using [text-generation-webui](https://github.com/oobabooga/text-generation-webui/tree/main). I used the `min_p` and `Null preset` presets with temperature=0.3 for generation.

## Usage

Please install `transformers` from the source repository that includes the necessary changes for this model.

```python
# pip install 'git+https://github.com/huggingface/transformers.git'
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "nitky/megac4ai-command-r-plus"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Format message with the command-r-plus chat template
messages = [{"role": "user", "content": "Hello, how are you?"}]
input_ids = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
## <BOS_TOKEN><|START_OF_TURN_TOKEN|><|USER_TOKEN|>Hello, how are you?<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>

gen_tokens = model.generate(
    input_ids,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.3,
)

gen_text = tokenizer.decode(gen_tokens[0])
print(gen_text)
```

### Quantized model through bitsandbytes, 4-bit precision

```python
# pip install 'git+https://github.com/huggingface/transformers.git' bitsandbytes accelerate
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(load_in_4bit=True)

model_id = "nitky/megac4ai-command-r-plus"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, quantization_config=bnb_config)

# Format message with the command-r-plus chat template
messages = [{"role": "user", "content": "Hello, how are you?"}]
input_ids = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
## <BOS_TOKEN><|START_OF_TURN_TOKEN|><|USER_TOKEN|>Hello, how are you?<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>

gen_tokens = model.generate(
    input_ids,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.3,
)

gen_text = tokenizer.decode(gen_tokens[0])
print(gen_text)
```

## Merge Details
### Merge Method

This model was merged using the passthrough merge method.

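The passthrough method copies weights unchanged: the merged network is simply the listed layer ranges of the source model stacked end to end, so overlapping ranges duplicate layers. A minimal sketch of the idea in plain Python (the layer labels are illustrative, not actual mergekit API):

```python
# Passthrough keeps each tensor as-is; the merged model is just the
# requested layer ranges of the source model concatenated in order.
def passthrough(slices):
    """slices: list of (model_name, start, end) -> merged layer list."""
    merged = []
    for model, start, end in slices:
        merged.extend((model, i) for i in range(start, end))
    return merged

# Two overlapping 20-layer slices, as in the configuration below.
layers = passthrough([
    ("c4ai-command-r-plus", 0, 20),
    ("c4ai-command-r-plus", 11, 31),
])
print(len(layers))  # 40: two 20-layer slices
print(layers[20])   # ('c4ai-command-r-plus', 11): the second slice re-uses layer 11
```

Because layers 11-19 appear in both slices, the merged stack runs through them twice, which is what "interleaving layers with itself" means here.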
### Models Merged

The following models were included in the merge:
* [CohereForAI/c4ai-command-r-plus](https://huggingface.co/CohereForAI/c4ai-command-r-plus)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: float16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 20]
    model: CohereForAI/c4ai-command-r-plus
- sources:
  - layer_range: [11, 31]
    model: CohereForAI/c4ai-command-r-plus
- sources:
  - layer_range: [22, 42]
    model: CohereForAI/c4ai-command-r-plus
- sources:
  - layer_range: [33, 53]
    model: CohereForAI/c4ai-command-r-plus
- sources:
  - layer_range: [44, 64]
    model: CohereForAI/c4ai-command-r-plus
```
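As a quick sanity check of this configuration: five slices of 20 layers each yield a 100-layer model, versus 64 layers in the base. The parameter estimate below assumes the commonly cited ~104B size of c4ai-command-r-plus and roughly linear scaling with layer count (embeddings and the LM head make this approximate), so treat it as a back-of-the-envelope figure:

```python
# The layer ranges from the YAML config above.
slices = [(0, 20), (11, 31), (22, 42), (33, 53), (44, 64)]

# Passthrough concatenates the slices, so the merged depth is their sum.
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 100 (base model has 64)

# Rough size estimate, assuming ~104B params in the 64-layer base and
# parameters scaling roughly linearly with depth.
approx_params_b = 104 * total_layers / 64
print(round(approx_params_b))  # 162, consistent with the "160B" label
```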