nitky committed on
Commit
2abfb9d
1 Parent(s): 40ed737

Upload 41 files

Files changed (42)
  1. .gitattributes +1 -0
  2. README.md +145 -0
  3. config.json +29 -0
  4. mergekit_config.yml +18 -0
  5. model-00001-of-00034.safetensors +3 -0
  6. model-00002-of-00034.safetensors +3 -0
  7. model-00003-of-00034.safetensors +3 -0
  8. model-00004-of-00034.safetensors +3 -0
  9. model-00005-of-00034.safetensors +3 -0
  10. model-00006-of-00034.safetensors +3 -0
  11. model-00007-of-00034.safetensors +3 -0
  12. model-00008-of-00034.safetensors +3 -0
  13. model-00009-of-00034.safetensors +3 -0
  14. model-00010-of-00034.safetensors +3 -0
  15. model-00011-of-00034.safetensors +3 -0
  16. model-00012-of-00034.safetensors +3 -0
  17. model-00013-of-00034.safetensors +3 -0
  18. model-00014-of-00034.safetensors +3 -0
  19. model-00015-of-00034.safetensors +3 -0
  20. model-00016-of-00034.safetensors +3 -0
  21. model-00017-of-00034.safetensors +3 -0
  22. model-00018-of-00034.safetensors +3 -0
  23. model-00019-of-00034.safetensors +3 -0
  24. model-00020-of-00034.safetensors +3 -0
  25. model-00021-of-00034.safetensors +3 -0
  26. model-00022-of-00034.safetensors +3 -0
  27. model-00023-of-00034.safetensors +3 -0
  28. model-00024-of-00034.safetensors +3 -0
  29. model-00025-of-00034.safetensors +3 -0
  30. model-00026-of-00034.safetensors +3 -0
  31. model-00027-of-00034.safetensors +3 -0
  32. model-00028-of-00034.safetensors +3 -0
  33. model-00029-of-00034.safetensors +3 -0
  34. model-00030-of-00034.safetensors +3 -0
  35. model-00031-of-00034.safetensors +3 -0
  36. model-00032-of-00034.safetensors +3 -0
  37. model-00033-of-00034.safetensors +3 -0
  38. model-00034-of-00034.safetensors +3 -0
  39. model.safetensors.index.json +1 -0
  40. special_tokens_map.json +23 -0
  41. tokenizer.json +3 -0
  42. tokenizer_config.json +330 -0
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+tokenizer.json filter=lfs diff=lfs merge=lfs -text
README.md CHANGED
@@ -1,3 +1,148 @@
 ---
+base_model:
+- CohereForAI/c4ai-command-r-plus
+library_name: transformers
+tags:
+- mergekit
+- merge
+language:
+- en
+- fr
+- de
+- es
+- it
+- pt
+- ja
+- ko
+- zh
+- ar
+pipeline_tag: text-generation
 license: cc-by-nc-4.0
 ---
+# Megac4ai-command-r-plus
+
+🚨 **This model was created using a special build of mergekit that supports c4ai-command-r-plus.**
+
+## Output comparison
+### Test Case Details
+
+Condition: `Null preset` with temperature=0.3
+
+```
+<|START_OF_TURN_TOKEN|><|USER_TOKEN|>ティム: やあ、調子はどう?
+キム: いろいろやろうとしてたんだけど、また先延ばしにしちゃったよ。
+ティム: 何をしようとしていたの?
+キム: 大学の課題だよ。どうにもやる気が出なくてね。
+ティム: 集中できないなら、ポモドーロ・テクニックをするといいよ。
+キム: 何それ?
+ティム: 25分作業して、5分休憩するのを繰り返すんだよ。一回あたりの作業時間が短くて集中できるよ。
+キム: うーん、集中っていうわけじゃないんだよね
+ティム: じゃあ1日に5分だけでいいから机で課題をするっていうのはどう?
+キム: 5分じゃ何もできなくない?
+ティム: 短い時間でもいいから机で作業するっていうのがポイントなんだよ。むしろもっとやりたい、くらいで止めておくと毎日続くもっと長い時間できるようになるよ。
+キム: 確かにそれならできるかも。ありがとう!
+Q: キムは何をやってみようとしていますか?また何故それをやろうとしていますか?<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>
+```
+
+This task (a Japanese dialogue-comprehension prompt) is included in [elyza/ELYZA-tasks-100](https://huggingface.co/datasets/elyza/ELYZA-tasks-100).
+
+### Output Example
+
+| Model | Output |
+|-------|--------|
+| c4ai-command-r-plus | Kim is trying to do a university assignment, but it seems Kim couldn't find the motivation and kept putting it off. |
+| Megac4ai-command-r-plus | Kim is trying to do a university assignment. Unable to find the motivation to focus on the work, Kim asked Tim for advice. By trying the Pomodoro Technique Tim suggested, or working at the desk for just five minutes a day, Kim is trying to build a habit of tackling the assignment. |
+
+## Test environment
+
+This model was tested using [text-generation-webui](https://github.com/oobabooga/text-generation-webui/tree/main). I used the `min_p` preset and the `Null preset` with temperature=0.3 for generation.
+
+## Usage
+
+Please install `transformers` from the source repository, which includes the changes required for this model.
+```python
+# pip install 'git+https://github.com/huggingface/transformers.git'
+from transformers import AutoTokenizer, AutoModelForCausalLM
+
+model_id = "nitky/megac4ai-command-r-plus"
+tokenizer = AutoTokenizer.from_pretrained(model_id)
+model = AutoModelForCausalLM.from_pretrained(model_id)
+
+# Format message with the command-r-plus chat template
+messages = [{"role": "user", "content": "Hello, how are you?"}]
+input_ids = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
+## <BOS_TOKEN><|START_OF_TURN_TOKEN|><|USER_TOKEN|>Hello, how are you?<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>
+
+gen_tokens = model.generate(
+    input_ids,
+    max_new_tokens=100,
+    do_sample=True,
+    temperature=0.3,
+)
+
+gen_text = tokenizer.decode(gen_tokens[0])
+print(gen_text)
+```
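At roughly 321 GB of float16 weights (see `model.safetensors.index.json`), loading the model as above requires corresponding memory. A minimal sketch of a common alternative, assuming `accelerate` is installed (`device_map="auto"` is standard `transformers` API, not something specific to this model):

```python
# Sketch: shard the ~321 GB float16 checkpoint across available GPUs and CPU RAM.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "nitky/megac4ai-command-r-plus",
    torch_dtype=torch.float16,  # matches "torch_dtype" in config.json
    device_map="auto",          # let accelerate place layers automatically
)
```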
+
+### Quantized model through bitsandbytes, 4-bit precision
+
+```python
+# pip install 'git+https://github.com/huggingface/transformers.git' bitsandbytes accelerate
+from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig
+
+bnb_config = BitsAndBytesConfig(load_in_4bit=True)
+
+model_id = "nitky/megac4ai-command-r-plus"
+tokenizer = AutoTokenizer.from_pretrained(model_id)
+model = AutoModelForCausalLM.from_pretrained(model_id, quantization_config=bnb_config)
+
+# Format message with the command-r-plus chat template
+messages = [{"role": "user", "content": "Hello, how are you?"}]
+input_ids = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
+## <BOS_TOKEN><|START_OF_TURN_TOKEN|><|USER_TOKEN|>Hello, how are you?<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>
+
+gen_tokens = model.generate(
+    input_ids,
+    max_new_tokens=100,
+    do_sample=True,
+    temperature=0.3,
+)
+
+gen_text = tokenizer.decode(gen_tokens[0])
+print(gen_text)
+```
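Loading in 4-bit stores weights at roughly 0.5 bytes per parameter, so on the order of 80 GB instead of ~321 GB. As an illustrative sketch (these are standard `BitsAndBytesConfig` options, not settings the author reports testing), NF4 quantization with a float16 compute dtype is a common refinement of the default config above:

```python
# Sketch: common 4-bit settings; ~160B params * 0.5 byte ≈ 80 GB of weights.
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",             # NormalFloat4 weight quantization
    bnb_4bit_compute_dtype=torch.float16,  # dtype used for matmul compute
    bnb_4bit_use_double_quant=True,        # also quantize quantization constants
)
```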
+
+## Merge Details
+### Merge Method
+
+This model was merged using the passthrough merge method.
+
+### Models Merged
+
+The following models were included in the merge:
+* [CohereForAI/c4ai-command-r-plus](https://huggingface.co/CohereForAI/c4ai-command-r-plus)
+
+### Configuration
+
+The following YAML configuration was used to produce this model:
+
+```yaml
+dtype: float16
+merge_method: passthrough
+slices:
+- sources:
+  - layer_range: [0, 20]
+    model: CohereForAI/c4ai-command-r-plus
+- sources:
+  - layer_range: [11, 31]
+    model: CohereForAI/c4ai-command-r-plus
+- sources:
+  - layer_range: [22, 42]
+    model: CohereForAI/c4ai-command-r-plus
+- sources:
+  - layer_range: [33, 53]
+    model: CohereForAI/c4ai-command-r-plus
+- sources:
+  - layer_range: [44, 64]
+    model: CohereForAI/c4ai-command-r-plus
+```
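For intuition, this is a self-merge: the five slices are overlapping 20-layer windows of the same 64-layer base model, so the merged network stacks 5 × 20 = 100 layers, matching `num_hidden_layers` in the `config.json` below. A quick sketch of that arithmetic:

```python
# Sketch: the passthrough slices above stack into the merged model's depth.
slices = [(0, 20), (11, 31), (22, 42), (33, 53), (44, 64)]  # layer_range entries
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 100 -> num_hidden_layers in config.json
```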
config.json ADDED
@@ -0,0 +1,29 @@
+{
+  "_name_or_path": "nitky/megac4ai-command-r-plus",
+  "architectures": [
+    "CohereForCausalLM"
+  ],
+  "attention_bias": false,
+  "attention_dropout": 0.0,
+  "bos_token_id": 5,
+  "eos_token_id": 255001,
+  "hidden_act": "silu",
+  "hidden_size": 12288,
+  "initializer_range": 0.02,
+  "intermediate_size": 33792,
+  "layer_norm_eps": 1e-05,
+  "logit_scale": 0.8333333333333334,
+  "max_position_embeddings": 8192,
+  "model_max_length": 131072,
+  "model_type": "cohere",
+  "num_attention_heads": 96,
+  "num_hidden_layers": 100,
+  "num_key_value_heads": 8,
+  "pad_token_id": 0,
+  "rope_theta": 75000000.0,
+  "torch_dtype": "float16",
+  "transformers_version": "4.40.0.dev0",
+  "use_cache": true,
+  "use_qk_norm": true,
+  "vocab_size": 256000
+}
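As a rough cross-check (an illustrative sketch, not part of the repository), the shapes in this config imply about 160B parameters, consistent with the ~321 GB float16 `total_size` recorded in `model.safetensors.index.json`:

```python
# Sketch: approximate parameter count implied by config.json (norm weights omitted).
hidden, inter, vocab = 12288, 33792, 256000
heads, kv_heads, layers = 96, 8, 100
head_dim = hidden // heads                        # 128
kv_dim = kv_heads * head_dim                      # 1024 (grouped-query attention)
attn = 2 * hidden * hidden + 2 * kv_dim * hidden  # q/o plus k/v projections
mlp = 3 * hidden * inter                          # gate/up/down projections
total = layers * (attn + mlp) + vocab * hidden    # plus tied embedding matrix
print(f"{total / 1e9:.1f}B parameters")           # ~160.4B -> ~321 GB in fp16
```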
mergekit_config.yml ADDED
@@ -0,0 +1,18 @@
+dtype: float16
+merge_method: passthrough
+slices:
+- sources:
+  - layer_range: [0, 20]
+    model: CohereForAI/c4ai-command-r-plus
+- sources:
+  - layer_range: [11, 31]
+    model: CohereForAI/c4ai-command-r-plus
+- sources:
+  - layer_range: [22, 42]
+    model: CohereForAI/c4ai-command-r-plus
+- sources:
+  - layer_range: [33, 53]
+    model: CohereForAI/c4ai-command-r-plus
+- sources:
+  - layer_range: [44, 64]
+    model: CohereForAI/c4ai-command-r-plus
model-00001-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b1818228d6180f41f5dc431c169a48f3991c9d73a6af739038db29138d6efd75
+size 9261078240
model-00002-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:21011e93f9c9303f5df20a5ccb82cd85227120d76c025f5dbd08b630e197a79e
+size 9437341072
model-00003-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:49812e4bc82fc5df2712f4b0aa27dc39632cd6a0fa54216bcec628be5193a6b8
+size 9437341072
model-00004-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fcf8129f20bdf78011e754fc1ef061c94baf732cbae63831ad0c467b65073173
+size 9437341080
model-00005-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f9091e1d09d398d2a9409132039364af0603357af4cf0086cec91b40788ea6c8
+size 9915543056
model-00006-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a20298c7648244d68e057c8780da1b8fb68e61dc9042f47beda5e4b2abf728ac
+size 9965830960
model-00007-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c2faeaa824a764fa46cab5d8dd042e5a23572f86fb7534c626af78b915799011
+size 9915538728
model-00008-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3a1a11d6a92f78a172ac28ed498a5b212930bc76666d4f64797ec5da6cf004ac
+size 9261182856
model-00009-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c755f54ec6a44417c09b9e20b6892b7a7764232643fcdc972cb5db49faac16fa
+size 9613499344
model-00010-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3519761c938e822f4c62e27c3f4edc1e568057caf9c6a330e8d7edc401d04536
+size 9437289560
model-00011-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f3840283646930e228c6f24006f9fb50647fb13c748b4c8e955f8b2231070a42
+size 9437314248
model-00012-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9638bc51b53365aeeea4fc8a26ee513b4b444b16998f9e246c38b315ce191332
+size 9613528360
model-00013-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:62646c48429201619660c182d5b648129e5d5fe20fc686a2549068ee67d090ed
+size 9261129152
model-00014-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:581a1204b1d863f5884aed9b8f293bcb81dd52b7b6b975609b1e4517fe41ff6f
+size 9613553048
model-00015-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:589471b01aa4f65c277e45c53328eb206b7a1e671f9889e359b5c5152e3119c4
+size 9261129152
model-00016-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d4e60931390e339ff4d9f8405f5d54d1fb937c691ea6f21519d3eef6a0a54cf6
+size 9613553048
model-00017-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2a0422c2dd2dc7fd5af724f9e7ec9e648d0983763d956cd81cff59da1db7eedd
+size 9789689840
model-00018-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:46518e93617cb7ec8a9782b0fffc8c8f6d3cb487b20a4677beb134f6a0cae51c
+size 9563143848
model-00019-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:84e0583687ecf301fa40319b8cff478f951f780021ce1a6b0c960fc1dfbb0884
+size 9613503672
model-00020-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:03ef9293a92112f3884892587375112068bea602a6115f3818ceca68297a67ef
+size 9261178528
model-00021-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ab3e3f61ed159fd47f264ab0b68e0cecd20f6b76710efff5c311b60a6887ee9e
+size 9613503672
model-00022-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:348e878109bea143b93709ff2468973310d6aa70e08cd943a50792d83eabd2c5
+size 9261178528
model-00023-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c8990f85b848eb6ca25205f3f467e93242b99ffd82902670c0318f5f8af14764
+size 9613476824
model-00024-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:58d95c21a69c823f718bc3eb70ebdc7a734dfb5327a97163cd762e3713a381eb
+size 9789716688
model-00025-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1798fc2a7de025666b66197fde077040e7c0614f8fe49f54283f26376d572cc5
+size 9915514048
model-00026-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ff0a9b73327b47a5f42d6940a22d478d5ba44c7d1cc413a433e9f5aba7a9e38b
+size 9261182848
model-00027-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:93a99e2d29351e754a4d98a5c3b2037aa3535ce34295927c2265566212e9ae02
+size 9915489352
model-00028-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c663e1b0c364d375d5f63871454582eb95cd15935c68de0fd8dac2d0434146b5
+size 9789665152
model-00029-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:bf89b6e7d27392342b031901bdc12038db5ac90a3d2b1c0a48b47ed351e2154f
+size 9789660832
model-00030-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:63c8fa90f888ee815c6fbab83fba620d2f630fd0814843c6eec2578e2e50febc
+size 9261152768
model-00031-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f4a87ddd144fd5e75552ddfea60aa6966e6fcd4a6662cae1fa3ac186c85f6e97
+size 9437341096
model-00032-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fa7de16b8ea1c8f3fcf8b3369a262aea74d2874a35e70c0eca21fe7c3c27364e
+size 9437341096
model-00033-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a9d282eaef70c7727ffda7b774dcd734de5353f96abdc874f509bb93bc356329
+size 9437341096
model-00034-of-00034.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1c44bc9f7b54d8a3abe4a69bfdb9dc2949d1192736890633b5f847344b5692aa
+size 5637244496
model.safetensors.index.json ADDED
@@ -0,0 +1 @@
+ {"metadata": {"mergekit_version": "0.0.4.2", "total_size": 320869400576}, "weight_map": {"model.embed_tokens.weight": "model-00001-of-00034.safetensors", "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00034.safetensors", "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00034.safetensors", "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00034.safetensors", "model.layers.1.self_attn.k_norm.weight": "model-00001-of-00034.safetensors", "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00034.safetensors", "model.layers.1.self_attn.q_norm.weight": "model-00001-of-00034.safetensors", "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00034.safetensors", "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00034.safetensors", "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00034.safetensors", "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00034.safetensors", "model.layers.0.self_attn.k_norm.weight": "model-00001-of-00034.safetensors", "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00034.safetensors", "model.layers.0.self_attn.q_norm.weight": "model-00001-of-00034.safetensors", "model.layers.0.mlp.up_proj.weight": "model-00001-of-00034.safetensors", "model.layers.0.mlp.gate_proj.weight": "model-00002-of-00034.safetensors", "model.layers.0.mlp.down_proj.weight": "model-00002-of-00034.safetensors", "model.layers.0.input_layernorm.weight": "model-00002-of-00034.safetensors", "model.layers.3.self_attn.k_norm.weight": "model-00002-of-00034.safetensors", "model.layers.3.self_attn.q_norm.weight": "model-00002-of-00034.safetensors", "model.layers.2.self_attn.o_proj.weight": "model-00002-of-00034.safetensors", "model.layers.2.self_attn.v_proj.weight": "model-00002-of-00034.safetensors", "model.layers.2.self_attn.k_proj.weight": "model-00002-of-00034.safetensors", "model.layers.2.self_attn.k_norm.weight": "model-00002-of-00034.safetensors", "model.layers.2.self_attn.q_proj.weight": "model-00002-of-00034.safetensors", "model.layers.2.self_attn.q_norm.weight": "model-00002-of-00034.safetensors", "model.layers.2.mlp.up_proj.weight": "model-00002-of-00034.safetensors", "model.layers.2.mlp.gate_proj.weight": "model-00002-of-00034.safetensors", "model.layers.2.mlp.down_proj.weight": "model-00002-of-00034.safetensors", "model.layers.2.input_layernorm.weight": "model-00002-of-00034.safetensors", "model.layers.1.mlp.up_proj.weight": "model-00002-of-00034.safetensors", "model.layers.1.mlp.down_proj.weight": "model-00002-of-00034.safetensors", "model.layers.1.input_layernorm.weight": "model-00002-of-00034.safetensors", "model.layers.4.self_attn.o_proj.weight": "model-00002-of-00034.safetensors", "model.layers.4.self_attn.v_proj.weight": "model-00002-of-00034.safetensors", "model.layers.4.self_attn.k_proj.weight": "model-00002-of-00034.safetensors", "model.layers.4.self_attn.k_norm.weight": "model-00002-of-00034.safetensors", "model.layers.4.self_attn.q_proj.weight": "model-00002-of-00034.safetensors", "model.layers.4.self_attn.q_norm.weight": "model-00002-of-00034.safetensors", "model.layers.4.mlp.gate_proj.weight": "model-00002-of-00034.safetensors", "model.layers.3.self_attn.o_proj.weight": "model-00002-of-00034.safetensors", "model.layers.3.self_attn.v_proj.weight": "model-00002-of-00034.safetensors", "model.layers.3.self_attn.k_proj.weight": "model-00002-of-00034.safetensors", "model.layers.3.self_attn.q_proj.weight": "model-00002-of-00034.safetensors", "model.layers.3.mlp.up_proj.weight": "model-00002-of-00034.safetensors", 
"model.layers.3.mlp.gate_proj.weight": "model-00003-of-00034.safetensors", "model.layers.3.mlp.down_proj.weight": "model-00003-of-00034.safetensors", "model.layers.3.input_layernorm.weight": "model-00003-of-00034.safetensors", "model.layers.6.self_attn.k_norm.weight": "model-00003-of-00034.safetensors", "model.layers.6.self_attn.q_norm.weight": "model-00003-of-00034.safetensors", "model.layers.5.self_attn.o_proj.weight": "model-00003-of-00034.safetensors", "model.layers.5.self_attn.v_proj.weight": "model-00003-of-00034.safetensors", "model.layers.5.self_attn.k_proj.weight": "model-00003-of-00034.safetensors", "model.layers.5.self_attn.k_norm.weight": "model-00003-of-00034.safetensors", "model.layers.5.self_attn.q_proj.weight": "model-00003-of-00034.safetensors", "model.layers.5.self_attn.q_norm.weight": "model-00003-of-00034.safetensors", "model.layers.5.mlp.up_proj.weight": "model-00003-of-00034.safetensors", "model.layers.5.mlp.gate_proj.weight": "model-00003-of-00034.safetensors", "model.layers.5.mlp.down_proj.weight": "model-00003-of-00034.safetensors", "model.layers.5.input_layernorm.weight": "model-00003-of-00034.safetensors", "model.layers.4.mlp.up_proj.weight": "model-00003-of-00034.safetensors", "model.layers.4.mlp.down_proj.weight": "model-00003-of-00034.safetensors", "model.layers.4.input_layernorm.weight": "model-00003-of-00034.safetensors", "model.layers.7.self_attn.o_proj.weight": "model-00003-of-00034.safetensors", "model.layers.7.self_attn.v_proj.weight": "model-00003-of-00034.safetensors", "model.layers.7.self_attn.k_proj.weight": "model-00003-of-00034.safetensors", "model.layers.7.self_attn.k_norm.weight": "model-00003-of-00034.safetensors", "model.layers.7.self_attn.q_proj.weight": "model-00003-of-00034.safetensors", "model.layers.7.self_attn.q_norm.weight": "model-00003-of-00034.safetensors", "model.layers.7.mlp.gate_proj.weight": "model-00003-of-00034.safetensors", "model.layers.6.self_attn.o_proj.weight": "model-00003-of-00034.safetensors", "model.layers.6.self_attn.v_proj.weight": "model-00003-of-00034.safetensors", "model.layers.6.self_attn.k_proj.weight": "model-00003-of-00034.safetensors", "model.layers.6.self_attn.q_proj.weight": "model-00003-of-00034.safetensors", "model.layers.6.mlp.up_proj.weight": "model-00003-of-00034.safetensors", "model.layers.6.mlp.gate_proj.weight": "model-00004-of-00034.safetensors", "model.layers.6.mlp.down_proj.weight": "model-00004-of-00034.safetensors", "model.layers.6.input_layernorm.weight": "model-00004-of-00034.safetensors", "model.layers.9.self_attn.k_norm.weight": "model-00004-of-00034.safetensors", "model.layers.9.self_attn.q_norm.weight": "model-00004-of-00034.safetensors", "model.layers.8.self_attn.o_proj.weight": "model-00004-of-00034.safetensors", "model.layers.8.self_attn.v_proj.weight": "model-00004-of-00034.safetensors", "model.layers.8.self_attn.k_proj.weight": "model-00004-of-00034.safetensors", "model.layers.8.self_attn.k_norm.weight": "model-00004-of-00034.safetensors", "model.layers.8.self_attn.q_proj.weight": "model-00004-of-00034.safetensors", "model.layers.8.self_attn.q_norm.weight": "model-00004-of-00034.safetensors", "model.layers.8.mlp.up_proj.weight": "model-00004-of-00034.safetensors", "model.layers.8.mlp.gate_proj.weight": "model-00004-of-00034.safetensors", "model.layers.8.mlp.down_proj.weight": "model-00004-of-00034.safetensors", "model.layers.8.input_layernorm.weight": "model-00004-of-00034.safetensors", "model.layers.7.mlp.up_proj.weight": "model-00004-of-00034.safetensors", 
"model.layers.7.mlp.down_proj.weight": "model-00004-of-00034.safetensors", "model.layers.7.input_layernorm.weight": "model-00004-of-00034.safetensors", "model.layers.10.self_attn.o_proj.weight": "model-00004-of-00034.safetensors", "model.layers.10.self_attn.v_proj.weight": "model-00004-of-00034.safetensors", "model.layers.10.self_attn.k_proj.weight": "model-00004-of-00034.safetensors", "model.layers.10.self_attn.k_norm.weight": "model-00004-of-00034.safetensors", "model.layers.10.self_attn.q_proj.weight": "model-00004-of-00034.safetensors", "model.layers.10.self_attn.q_norm.weight": "model-00004-of-00034.safetensors", "model.layers.10.mlp.gate_proj.weight": "model-00004-of-00034.safetensors", "model.layers.9.self_attn.o_proj.weight": "model-00004-of-00034.safetensors", "model.layers.9.self_attn.v_proj.weight": "model-00004-of-00034.safetensors", "model.layers.9.self_attn.k_proj.weight": "model-00004-of-00034.safetensors", "model.layers.9.self_attn.q_proj.weight": "model-00004-of-00034.safetensors", "model.layers.9.mlp.up_proj.weight": "model-00004-of-00034.safetensors", "model.layers.9.mlp.gate_proj.weight": "model-00005-of-00034.safetensors", "model.layers.9.mlp.down_proj.weight": "model-00005-of-00034.safetensors", "model.layers.9.input_layernorm.weight": "model-00005-of-00034.safetensors", "model.layers.21.self_attn.k_norm.weight": "model-00005-of-00034.safetensors", "model.layers.12.self_attn.k_norm.weight": "model-00005-of-00034.safetensors", "model.layers.21.self_attn.q_norm.weight": "model-00005-of-00034.safetensors", "model.layers.12.self_attn.q_norm.weight": "model-00005-of-00034.safetensors", "model.layers.20.self_attn.o_proj.weight": "model-00005-of-00034.safetensors", "model.layers.11.self_attn.o_proj.weight": "model-00005-of-00034.safetensors", "model.layers.20.self_attn.v_proj.weight": "model-00005-of-00034.safetensors", "model.layers.11.self_attn.v_proj.weight": "model-00005-of-00034.safetensors", "model.layers.20.self_attn.k_proj.weight": "model-00005-of-00034.safetensors", "model.layers.11.self_attn.k_proj.weight": "model-00005-of-00034.safetensors", "model.layers.20.self_attn.k_norm.weight": "model-00005-of-00034.safetensors", "model.layers.11.self_attn.k_norm.weight": "model-00005-of-00034.safetensors", "model.layers.20.self_attn.q_proj.weight": "model-00005-of-00034.safetensors", "model.layers.11.self_attn.q_proj.weight": "model-00005-of-00034.safetensors", "model.layers.20.self_attn.q_norm.weight": "model-00005-of-00034.safetensors", "model.layers.11.self_attn.q_norm.weight": "model-00005-of-00034.safetensors", "model.layers.20.mlp.up_proj.weight": "model-00005-of-00034.safetensors", "model.layers.11.mlp.up_proj.weight": "model-00005-of-00034.safetensors", "model.layers.20.mlp.gate_proj.weight": "model-00005-of-00034.safetensors", "model.layers.11.mlp.gate_proj.weight": "model-00005-of-00034.safetensors", "model.layers.20.mlp.down_proj.weight": "model-00005-of-00034.safetensors", "model.layers.11.mlp.down_proj.weight": "model-00005-of-00034.safetensors", "model.layers.20.input_layernorm.weight": "model-00005-of-00034.safetensors", "model.layers.11.input_layernorm.weight": "model-00005-of-00034.safetensors", "model.layers.10.mlp.up_proj.weight": "model-00005-of-00034.safetensors", "model.layers.10.mlp.down_proj.weight": "model-00005-of-00034.safetensors", "model.layers.10.input_layernorm.weight": "model-00005-of-00034.safetensors", "model.layers.22.self_attn.o_proj.weight": "model-00005-of-00034.safetensors", "model.layers.13.self_attn.o_proj.weight": 
"model-00006-of-00034.safetensors", "model.layers.22.self_attn.v_proj.weight": "model-00006-of-00034.safetensors", "model.layers.13.self_attn.v_proj.weight": "model-00006-of-00034.safetensors", "model.layers.22.self_attn.k_proj.weight": "model-00006-of-00034.safetensors", "model.layers.13.self_attn.k_proj.weight": "model-00006-of-00034.safetensors", "model.layers.22.self_attn.k_norm.weight": "model-00006-of-00034.safetensors", "model.layers.13.self_attn.k_norm.weight": "model-00006-of-00034.safetensors", "model.layers.22.self_attn.q_proj.weight": "model-00006-of-00034.safetensors", "model.layers.13.self_attn.q_proj.weight": "model-00006-of-00034.safetensors", "model.layers.22.self_attn.q_norm.weight": "model-00006-of-00034.safetensors", "model.layers.13.self_attn.q_norm.weight": "model-00006-of-00034.safetensors", "model.layers.22.mlp.gate_proj.weight": "model-00006-of-00034.safetensors", "model.layers.13.mlp.gate_proj.weight": "model-00006-of-00034.safetensors", "model.layers.21.self_attn.o_proj.weight": "model-00006-of-00034.safetensors", "model.layers.12.self_attn.o_proj.weight": "model-00006-of-00034.safetensors", "model.layers.21.self_attn.v_proj.weight": "model-00006-of-00034.safetensors", "model.layers.12.self_attn.v_proj.weight": "model-00006-of-00034.safetensors", "model.layers.21.self_attn.k_proj.weight": "model-00006-of-00034.safetensors", "model.layers.12.self_attn.k_proj.weight": "model-00006-of-00034.safetensors", "model.layers.21.self_attn.q_proj.weight": "model-00006-of-00034.safetensors", "model.layers.12.self_attn.q_proj.weight": "model-00006-of-00034.safetensors", "model.layers.21.mlp.up_proj.weight": "model-00006-of-00034.safetensors", "model.layers.12.mlp.up_proj.weight": "model-00006-of-00034.safetensors", "model.layers.21.mlp.gate_proj.weight": "model-00006-of-00034.safetensors", "model.layers.12.mlp.gate_proj.weight": "model-00006-of-00034.safetensors", "model.layers.21.mlp.down_proj.weight": "model-00006-of-00034.safetensors", "model.layers.12.mlp.down_proj.weight": "model-00006-of-00034.safetensors", "model.layers.21.input_layernorm.weight": "model-00006-of-00034.safetensors", "model.layers.12.input_layernorm.weight": "model-00006-of-00034.safetensors", "model.layers.24.self_attn.k_norm.weight": "model-00006-of-00034.safetensors", "model.layers.15.self_attn.k_norm.weight": "model-00006-of-00034.safetensors", "model.layers.24.self_attn.q_norm.weight": "model-00006-of-00034.safetensors", "model.layers.15.self_attn.q_norm.weight": "model-00006-of-00034.safetensors", "model.layers.23.self_attn.o_proj.weight": "model-00006-of-00034.safetensors", "model.layers.14.self_attn.o_proj.weight": "model-00006-of-00034.safetensors", "model.layers.23.self_attn.v_proj.weight": "model-00006-of-00034.safetensors", "model.layers.14.self_attn.v_proj.weight": "model-00006-of-00034.safetensors", "model.layers.23.self_attn.k_proj.weight": "model-00006-of-00034.safetensors", "model.layers.14.self_attn.k_proj.weight": "model-00006-of-00034.safetensors", "model.layers.23.self_attn.k_norm.weight": "model-00006-of-00034.safetensors", "model.layers.14.self_attn.k_norm.weight": "model-00006-of-00034.safetensors", "model.layers.23.self_attn.q_proj.weight": "model-00006-of-00034.safetensors", "model.layers.14.self_attn.q_proj.weight": "model-00007-of-00034.safetensors", "model.layers.23.self_attn.q_norm.weight": "model-00007-of-00034.safetensors", "model.layers.14.self_attn.q_norm.weight": "model-00007-of-00034.safetensors", "model.layers.23.mlp.up_proj.weight": 
"model-00007-of-00034.safetensors", "model.layers.14.mlp.up_proj.weight": "model-00007-of-00034.safetensors", "model.layers.23.mlp.gate_proj.weight": "model-00007-of-00034.safetensors", "model.layers.14.mlp.gate_proj.weight": "model-00007-of-00034.safetensors", "model.layers.23.mlp.down_proj.weight": "model-00007-of-00034.safetensors", "model.layers.14.mlp.down_proj.weight": "model-00007-of-00034.safetensors", "model.layers.23.input_layernorm.weight": "model-00007-of-00034.safetensors", "model.layers.14.input_layernorm.weight": "model-00007-of-00034.safetensors", "model.layers.22.mlp.up_proj.weight": "model-00007-of-00034.safetensors", "model.layers.13.mlp.up_proj.weight": "model-00007-of-00034.safetensors", "model.layers.22.mlp.down_proj.weight": "model-00007-of-00034.safetensors", "model.layers.13.mlp.down_proj.weight": "model-00007-of-00034.safetensors", "model.layers.22.input_layernorm.weight": "model-00007-of-00034.safetensors", "model.layers.13.input_layernorm.weight": "model-00007-of-00034.safetensors", "model.layers.25.self_attn.o_proj.weight": "model-00007-of-00034.safetensors", "model.layers.16.self_attn.o_proj.weight": "model-00007-of-00034.safetensors", "model.layers.25.self_attn.v_proj.weight": "model-00007-of-00034.safetensors", "model.layers.16.self_attn.v_proj.weight": "model-00007-of-00034.safetensors", "model.layers.25.self_attn.k_proj.weight": "model-00007-of-00034.safetensors", "model.layers.16.self_attn.k_proj.weight": "model-00007-of-00034.safetensors", "model.layers.25.self_attn.k_norm.weight": "model-00007-of-00034.safetensors", "model.layers.16.self_attn.k_norm.weight": "model-00007-of-00034.safetensors", "model.layers.25.self_attn.q_proj.weight": "model-00007-of-00034.safetensors", "model.layers.16.self_attn.q_proj.weight": "model-00007-of-00034.safetensors", "model.layers.25.self_attn.q_norm.weight": "model-00007-of-00034.safetensors", "model.layers.16.self_attn.q_norm.weight": "model-00007-of-00034.safetensors", "model.layers.25.mlp.gate_proj.weight": "model-00008-of-00034.safetensors", "model.layers.16.mlp.gate_proj.weight": "model-00008-of-00034.safetensors", "model.layers.24.self_attn.o_proj.weight": "model-00008-of-00034.safetensors", "model.layers.15.self_attn.o_proj.weight": "model-00008-of-00034.safetensors", "model.layers.24.self_attn.v_proj.weight": "model-00008-of-00034.safetensors", "model.layers.15.self_attn.v_proj.weight": "model-00008-of-00034.safetensors", "model.layers.24.self_attn.k_proj.weight": "model-00008-of-00034.safetensors", "model.layers.15.self_attn.k_proj.weight": "model-00008-of-00034.safetensors", "model.layers.24.self_attn.q_proj.weight": "model-00008-of-00034.safetensors", "model.layers.15.self_attn.q_proj.weight": "model-00008-of-00034.safetensors", "model.layers.24.mlp.up_proj.weight": "model-00008-of-00034.safetensors", "model.layers.15.mlp.up_proj.weight": "model-00008-of-00034.safetensors", "model.layers.24.mlp.gate_proj.weight": "model-00008-of-00034.safetensors", "model.layers.15.mlp.gate_proj.weight": "model-00008-of-00034.safetensors", "model.layers.24.mlp.down_proj.weight": "model-00008-of-00034.safetensors", "model.layers.15.mlp.down_proj.weight": "model-00008-of-00034.safetensors", "model.layers.24.input_layernorm.weight": "model-00008-of-00034.safetensors", "model.layers.15.input_layernorm.weight": "model-00008-of-00034.safetensors", "model.layers.27.self_attn.k_norm.weight": "model-00008-of-00034.safetensors", "model.layers.18.self_attn.k_norm.weight": "model-00008-of-00034.safetensors", 
"model.layers.27.self_attn.q_norm.weight": "model-00008-of-00034.safetensors", "model.layers.18.self_attn.q_norm.weight": "model-00008-of-00034.safetensors", "model.layers.26.self_attn.o_proj.weight": "model-00008-of-00034.safetensors", "model.layers.17.self_attn.o_proj.weight": "model-00008-of-00034.safetensors", "model.layers.26.self_attn.v_proj.weight": "model-00008-of-00034.safetensors", "model.layers.17.self_attn.v_proj.weight": "model-00008-of-00034.safetensors", "model.layers.26.self_attn.k_proj.weight": "model-00008-of-00034.safetensors", "model.layers.17.self_attn.k_proj.weight": "model-00008-of-00034.safetensors", "model.layers.26.self_attn.k_norm.weight": "model-00008-of-00034.safetensors", "model.layers.17.self_attn.k_norm.weight": "model-00008-of-00034.safetensors", "model.layers.26.self_attn.q_proj.weight": "model-00008-of-00034.safetensors", "model.layers.17.self_attn.q_proj.weight": "model-00008-of-00034.safetensors", "model.layers.26.self_attn.q_norm.weight": "model-00008-of-00034.safetensors", "model.layers.17.self_attn.q_norm.weight": "model-00008-of-00034.safetensors", "model.layers.26.mlp.up_proj.weight": "model-00009-of-00034.safetensors", "model.layers.17.mlp.up_proj.weight": "model-00009-of-00034.safetensors", "model.layers.26.mlp.gate_proj.weight": "model-00009-of-00034.safetensors", "model.layers.17.mlp.gate_proj.weight": "model-00009-of-00034.safetensors", "model.layers.26.mlp.down_proj.weight": "model-00009-of-00034.safetensors", "model.layers.17.mlp.down_proj.weight": "model-00009-of-00034.safetensors", "model.layers.26.input_layernorm.weight": "model-00009-of-00034.safetensors", "model.layers.17.input_layernorm.weight": "model-00009-of-00034.safetensors", "model.layers.25.mlp.up_proj.weight": "model-00009-of-00034.safetensors", "model.layers.16.mlp.up_proj.weight": "model-00009-of-00034.safetensors", "model.layers.25.mlp.down_proj.weight": "model-00009-of-00034.safetensors", "model.layers.16.mlp.down_proj.weight": "model-00009-of-00034.safetensors", "model.layers.25.input_layernorm.weight": "model-00009-of-00034.safetensors", "model.layers.16.input_layernorm.weight": "model-00009-of-00034.safetensors", "model.layers.28.self_attn.o_proj.weight": "model-00009-of-00034.safetensors", "model.layers.19.self_attn.o_proj.weight": "model-00009-of-00034.safetensors", "model.layers.28.self_attn.v_proj.weight": "model-00009-of-00034.safetensors", "model.layers.19.self_attn.v_proj.weight": "model-00009-of-00034.safetensors", "model.layers.28.self_attn.k_proj.weight": "model-00009-of-00034.safetensors", "model.layers.19.self_attn.k_proj.weight": "model-00009-of-00034.safetensors", "model.layers.28.self_attn.k_norm.weight": "model-00009-of-00034.safetensors", "model.layers.19.self_attn.k_norm.weight": "model-00009-of-00034.safetensors", "model.layers.28.self_attn.q_proj.weight": "model-00009-of-00034.safetensors", "model.layers.19.self_attn.q_proj.weight": "model-00009-of-00034.safetensors", "model.layers.28.self_attn.q_norm.weight": "model-00009-of-00034.safetensors", "model.layers.19.self_attn.q_norm.weight": "model-00009-of-00034.safetensors", "model.layers.28.mlp.gate_proj.weight": "model-00010-of-00034.safetensors", "model.layers.19.mlp.gate_proj.weight": "model-00010-of-00034.safetensors", "model.layers.27.self_attn.o_proj.weight": "model-00010-of-00034.safetensors", "model.layers.18.self_attn.o_proj.weight": "model-00010-of-00034.safetensors", "model.layers.27.self_attn.v_proj.weight": "model-00010-of-00034.safetensors", "model.layers.18.self_attn.v_proj.weight": 
"model-00010-of-00034.safetensors", "model.layers.27.self_attn.k_proj.weight": "model-00010-of-00034.safetensors", "model.layers.18.self_attn.k_proj.weight": "model-00010-of-00034.safetensors", "model.layers.27.self_attn.q_proj.weight": "model-00010-of-00034.safetensors", "model.layers.18.self_attn.q_proj.weight": "model-00010-of-00034.safetensors", "model.layers.27.mlp.up_proj.weight": "model-00010-of-00034.safetensors", "model.layers.18.mlp.up_proj.weight": "model-00010-of-00034.safetensors", "model.layers.27.mlp.gate_proj.weight": "model-00010-of-00034.safetensors", "model.layers.18.mlp.gate_proj.weight": "model-00010-of-00034.safetensors", "model.layers.27.mlp.down_proj.weight": "model-00010-of-00034.safetensors", "model.layers.18.mlp.down_proj.weight": "model-00010-of-00034.safetensors", "model.layers.27.input_layernorm.weight": "model-00010-of-00034.safetensors", "model.layers.18.input_layernorm.weight": "model-00010-of-00034.safetensors", "model.layers.30.self_attn.k_norm.weight": "model-00010-of-00034.safetensors", "model.layers.30.self_attn.q_norm.weight": "model-00010-of-00034.safetensors", "model.layers.29.self_attn.o_proj.weight": "model-00010-of-00034.safetensors", "model.layers.29.self_attn.v_proj.weight": "model-00010-of-00034.safetensors", "model.layers.29.self_attn.k_proj.weight": "model-00010-of-00034.safetensors", "model.layers.29.self_attn.k_norm.weight": "model-00010-of-00034.safetensors", "model.layers.29.self_attn.q_proj.weight": "model-00010-of-00034.safetensors", "model.layers.29.self_attn.q_norm.weight": "model-00010-of-00034.safetensors", "model.layers.29.mlp.up_proj.weight": "model-00010-of-00034.safetensors", "model.layers.29.mlp.gate_proj.weight": "model-00011-of-00034.safetensors", "model.layers.29.mlp.down_proj.weight": "model-00011-of-00034.safetensors", "model.layers.29.input_layernorm.weight": "model-00011-of-00034.safetensors", "model.layers.28.mlp.up_proj.weight": "model-00011-of-00034.safetensors", "model.layers.19.mlp.up_proj.weight": "model-00011-of-00034.safetensors", "model.layers.28.mlp.down_proj.weight": "model-00011-of-00034.safetensors", "model.layers.19.mlp.down_proj.weight": "model-00011-of-00034.safetensors", "model.layers.28.input_layernorm.weight": "model-00011-of-00034.safetensors", "model.layers.19.input_layernorm.weight": "model-00011-of-00034.safetensors", "model.layers.40.self_attn.o_proj.weight": "model-00011-of-00034.safetensors", "model.layers.31.self_attn.o_proj.weight": "model-00011-of-00034.safetensors", "model.layers.40.self_attn.v_proj.weight": "model-00011-of-00034.safetensors", "model.layers.31.self_attn.v_proj.weight": "model-00011-of-00034.safetensors", "model.layers.40.self_attn.k_proj.weight": "model-00011-of-00034.safetensors", "model.layers.31.self_attn.k_proj.weight": "model-00011-of-00034.safetensors", "model.layers.40.self_attn.k_norm.weight": "model-00011-of-00034.safetensors", "model.layers.31.self_attn.k_norm.weight": "model-00011-of-00034.safetensors", "model.layers.40.self_attn.q_proj.weight": "model-00011-of-00034.safetensors", "model.layers.31.self_attn.q_proj.weight": "model-00011-of-00034.safetensors", "model.layers.40.self_attn.q_norm.weight": "model-00011-of-00034.safetensors", "model.layers.31.self_attn.q_norm.weight": "model-00011-of-00034.safetensors", "model.layers.40.mlp.gate_proj.weight": "model-00011-of-00034.safetensors", "model.layers.31.mlp.gate_proj.weight": "model-00011-of-00034.safetensors", "model.layers.30.self_attn.o_proj.weight": "model-00011-of-00034.safetensors", 
"model.layers.30.self_attn.v_proj.weight": "model-00011-of-00034.safetensors", "model.layers.30.self_attn.k_proj.weight": "model-00011-of-00034.safetensors", "model.layers.30.self_attn.q_proj.weight": "model-00011-of-00034.safetensors", "model.layers.30.mlp.up_proj.weight": "model-00011-of-00034.safetensors", "model.layers.30.mlp.gate_proj.weight": "model-00012-of-00034.safetensors", "model.layers.30.mlp.down_proj.weight": "model-00012-of-00034.safetensors", "model.layers.30.input_layernorm.weight": "model-00012-of-00034.safetensors", "model.layers.42.self_attn.k_norm.weight": "model-00012-of-00034.safetensors", "model.layers.33.self_attn.k_norm.weight": "model-00012-of-00034.safetensors", "model.layers.42.self_attn.q_norm.weight": "model-00012-of-00034.safetensors", "model.layers.33.self_attn.q_norm.weight": "model-00012-of-00034.safetensors", "model.layers.41.self_attn.o_proj.weight": "model-00012-of-00034.safetensors", "model.layers.32.self_attn.o_proj.weight": "model-00012-of-00034.safetensors", "model.layers.41.self_attn.v_proj.weight": "model-00012-of-00034.safetensors", "model.layers.32.self_attn.v_proj.weight": "model-00012-of-00034.safetensors", "model.layers.41.self_attn.k_proj.weight": "model-00012-of-00034.safetensors", "model.layers.32.self_attn.k_proj.weight": "model-00012-of-00034.safetensors", "model.layers.41.self_attn.k_norm.weight": "model-00012-of-00034.safetensors", "model.layers.32.self_attn.k_norm.weight": "model-00012-of-00034.safetensors", "model.layers.41.self_attn.q_proj.weight": "model-00012-of-00034.safetensors", "model.layers.32.self_attn.q_proj.weight": "model-00012-of-00034.safetensors", "model.layers.41.self_attn.q_norm.weight": "model-00012-of-00034.safetensors", "model.layers.32.self_attn.q_norm.weight": "model-00012-of-00034.safetensors", "model.layers.41.mlp.up_proj.weight": "model-00012-of-00034.safetensors", "model.layers.32.mlp.up_proj.weight": "model-00012-of-00034.safetensors", "model.layers.41.mlp.gate_proj.weight": "model-00012-of-00034.safetensors", "model.layers.32.mlp.gate_proj.weight": "model-00012-of-00034.safetensors", "model.layers.41.mlp.down_proj.weight": "model-00012-of-00034.safetensors", "model.layers.32.mlp.down_proj.weight": "model-00012-of-00034.safetensors", "model.layers.41.input_layernorm.weight": "model-00012-of-00034.safetensors", "model.layers.32.input_layernorm.weight": "model-00012-of-00034.safetensors", "model.layers.40.mlp.up_proj.weight": "model-00012-of-00034.safetensors", "model.layers.31.mlp.up_proj.weight": "model-00012-of-00034.safetensors", "model.layers.40.mlp.down_proj.weight": "model-00013-of-00034.safetensors", "model.layers.31.mlp.down_proj.weight": "model-00013-of-00034.safetensors", "model.layers.40.input_layernorm.weight": "model-00013-of-00034.safetensors", "model.layers.31.input_layernorm.weight": "model-00013-of-00034.safetensors", "model.layers.43.self_attn.o_proj.weight": "model-00013-of-00034.safetensors", "model.layers.34.self_attn.o_proj.weight": "model-00013-of-00034.safetensors", "model.layers.43.self_attn.v_proj.weight": "model-00013-of-00034.safetensors", "model.layers.34.self_attn.v_proj.weight": "model-00013-of-00034.safetensors", "model.layers.43.self_attn.k_proj.weight": "model-00013-of-00034.safetensors", "model.layers.34.self_attn.k_proj.weight": "model-00013-of-00034.safetensors", "model.layers.43.self_attn.k_norm.weight": "model-00013-of-00034.safetensors", "model.layers.34.self_attn.k_norm.weight": "model-00013-of-00034.safetensors", "model.layers.43.self_attn.q_proj.weight": 
"model-00013-of-00034.safetensors", "model.layers.34.self_attn.q_proj.weight": "model-00013-of-00034.safetensors", "model.layers.43.self_attn.q_norm.weight": "model-00013-of-00034.safetensors", "model.layers.34.self_attn.q_norm.weight": "model-00013-of-00034.safetensors", "model.layers.43.mlp.gate_proj.weight": "model-00013-of-00034.safetensors", "model.layers.34.mlp.gate_proj.weight": "model-00013-of-00034.safetensors", "model.layers.42.self_attn.o_proj.weight": "model-00013-of-00034.safetensors", "model.layers.33.self_attn.o_proj.weight": "model-00013-of-00034.safetensors", "model.layers.42.self_attn.v_proj.weight": "model-00013-of-00034.safetensors", "model.layers.33.self_attn.v_proj.weight": "model-00013-of-00034.safetensors", "model.layers.42.self_attn.k_proj.weight": "model-00013-of-00034.safetensors", "model.layers.33.self_attn.k_proj.weight": "model-00013-of-00034.safetensors", "model.layers.42.self_attn.q_proj.weight": "model-00013-of-00034.safetensors", "model.layers.33.self_attn.q_proj.weight": "model-00013-of-00034.safetensors", "model.layers.42.mlp.up_proj.weight": "model-00013-of-00034.safetensors", "model.layers.33.mlp.up_proj.weight": "model-00013-of-00034.safetensors", "model.layers.42.mlp.gate_proj.weight": "model-00013-of-00034.safetensors", "model.layers.33.mlp.gate_proj.weight": "model-00013-of-00034.safetensors", "model.layers.42.mlp.down_proj.weight": "model-00014-of-00034.safetensors", "model.layers.33.mlp.down_proj.weight": "model-00014-of-00034.safetensors", "model.layers.42.input_layernorm.weight": "model-00014-of-00034.safetensors", "model.layers.33.input_layernorm.weight": "model-00014-of-00034.safetensors", "model.layers.45.self_attn.k_norm.weight": "model-00014-of-00034.safetensors", "model.layers.36.self_attn.k_norm.weight": "model-00014-of-00034.safetensors", "model.layers.45.self_attn.q_norm.weight": "model-00014-of-00034.safetensors", "model.layers.36.self_attn.q_norm.weight": "model-00014-of-00034.safetensors", "model.layers.44.self_attn.o_proj.weight": "model-00014-of-00034.safetensors", "model.layers.35.self_attn.o_proj.weight": "model-00014-of-00034.safetensors", "model.layers.44.self_attn.v_proj.weight": "model-00014-of-00034.safetensors", "model.layers.35.self_attn.v_proj.weight": "model-00014-of-00034.safetensors", "model.layers.44.self_attn.k_proj.weight": "model-00014-of-00034.safetensors", "model.layers.35.self_attn.k_proj.weight": "model-00014-of-00034.safetensors", "model.layers.44.self_attn.k_norm.weight": "model-00014-of-00034.safetensors", "model.layers.35.self_attn.k_norm.weight": "model-00014-of-00034.safetensors", "model.layers.44.self_attn.q_proj.weight": "model-00014-of-00034.safetensors", "model.layers.35.self_attn.q_proj.weight": "model-00014-of-00034.safetensors", "model.layers.44.self_attn.q_norm.weight": "model-00014-of-00034.safetensors", "model.layers.35.self_attn.q_norm.weight": "model-00014-of-00034.safetensors", "model.layers.44.mlp.up_proj.weight": "model-00014-of-00034.safetensors", "model.layers.35.mlp.up_proj.weight": "model-00014-of-00034.safetensors", "model.layers.44.mlp.gate_proj.weight": "model-00014-of-00034.safetensors", "model.layers.35.mlp.gate_proj.weight": "model-00014-of-00034.safetensors", "model.layers.44.mlp.down_proj.weight": "model-00014-of-00034.safetensors", "model.layers.35.mlp.down_proj.weight": "model-00014-of-00034.safetensors", "model.layers.44.input_layernorm.weight": "model-00014-of-00034.safetensors", "model.layers.35.input_layernorm.weight": "model-00014-of-00034.safetensors", 
"model.layers.43.mlp.up_proj.weight": "model-00014-of-00034.safetensors", "model.layers.34.mlp.up_proj.weight": "model-00014-of-00034.safetensors", "model.layers.43.mlp.down_proj.weight": "model-00015-of-00034.safetensors", "model.layers.34.mlp.down_proj.weight": "model-00015-of-00034.safetensors", "model.layers.43.input_layernorm.weight": "model-00015-of-00034.safetensors", "model.layers.34.input_layernorm.weight": "model-00015-of-00034.safetensors", "model.layers.46.self_attn.o_proj.weight": "model-00015-of-00034.safetensors", "model.layers.37.self_attn.o_proj.weight": "model-00015-of-00034.safetensors", "model.layers.46.self_attn.v_proj.weight": "model-00015-of-00034.safetensors", "model.layers.37.self_attn.v_proj.weight": "model-00015-of-00034.safetensors", "model.layers.46.self_attn.k_proj.weight": "model-00015-of-00034.safetensors", "model.layers.37.self_attn.k_proj.weight": "model-00015-of-00034.safetensors", "model.layers.46.self_attn.k_norm.weight": "model-00015-of-00034.safetensors", "model.layers.37.self_attn.k_norm.weight": "model-00015-of-00034.safetensors", "model.layers.46.self_attn.q_proj.weight": "model-00015-of-00034.safetensors", "model.layers.37.self_attn.q_proj.weight": "model-00015-of-00034.safetensors", "model.layers.46.self_attn.q_norm.weight": "model-00015-of-00034.safetensors", "model.layers.37.self_attn.q_norm.weight": "model-00015-of-00034.safetensors", "model.layers.46.mlp.gate_proj.weight": "model-00015-of-00034.safetensors", "model.layers.37.mlp.gate_proj.weight": "model-00015-of-00034.safetensors", "model.layers.45.self_attn.o_proj.weight": "model-00015-of-00034.safetensors", "model.layers.36.self_attn.o_proj.weight": "model-00015-of-00034.safetensors", "model.layers.45.self_attn.v_proj.weight": "model-00015-of-00034.safetensors", "model.layers.36.self_attn.v_proj.weight": "model-00015-of-00034.safetensors", "model.layers.45.self_attn.k_proj.weight": "model-00015-of-00034.safetensors", "model.layers.36.self_attn.k_proj.weight": "model-00015-of-00034.safetensors", "model.layers.45.self_attn.q_proj.weight": "model-00015-of-00034.safetensors", "model.layers.36.self_attn.q_proj.weight": "model-00015-of-00034.safetensors", "model.layers.45.mlp.up_proj.weight": "model-00015-of-00034.safetensors", "model.layers.36.mlp.up_proj.weight": "model-00015-of-00034.safetensors", "model.layers.45.mlp.gate_proj.weight": "model-00015-of-00034.safetensors", "model.layers.36.mlp.gate_proj.weight": "model-00015-of-00034.safetensors", "model.layers.45.mlp.down_proj.weight": "model-00016-of-00034.safetensors", "model.layers.36.mlp.down_proj.weight": "model-00016-of-00034.safetensors", "model.layers.45.input_layernorm.weight": "model-00016-of-00034.safetensors", "model.layers.36.input_layernorm.weight": "model-00016-of-00034.safetensors", "model.layers.48.self_attn.k_norm.weight": "model-00016-of-00034.safetensors", "model.layers.39.self_attn.k_norm.weight": "model-00016-of-00034.safetensors", "model.layers.48.self_attn.q_norm.weight": "model-00016-of-00034.safetensors", "model.layers.39.self_attn.q_norm.weight": "model-00016-of-00034.safetensors", "model.layers.47.self_attn.o_proj.weight": "model-00016-of-00034.safetensors", "model.layers.38.self_attn.o_proj.weight": "model-00016-of-00034.safetensors", "model.layers.47.self_attn.v_proj.weight": "model-00016-of-00034.safetensors", "model.layers.38.self_attn.v_proj.weight": "model-00016-of-00034.safetensors", "model.layers.47.self_attn.k_proj.weight": "model-00016-of-00034.safetensors", "model.layers.38.self_attn.k_proj.weight": 
"model-00016-of-00034.safetensors", "model.layers.47.self_attn.k_norm.weight": "model-00016-of-00034.safetensors", "model.layers.38.self_attn.k_norm.weight": "model-00016-of-00034.safetensors", "model.layers.47.self_attn.q_proj.weight": "model-00016-of-00034.safetensors", "model.layers.38.self_attn.q_proj.weight": "model-00016-of-00034.safetensors", "model.layers.47.self_attn.q_norm.weight": "model-00016-of-00034.safetensors", "model.layers.38.self_attn.q_norm.weight": "model-00016-of-00034.safetensors", "model.layers.47.mlp.up_proj.weight": "model-00016-of-00034.safetensors", "model.layers.38.mlp.up_proj.weight": "model-00016-of-00034.safetensors", "model.layers.47.mlp.gate_proj.weight": "model-00016-of-00034.safetensors", "model.layers.38.mlp.gate_proj.weight": "model-00016-of-00034.safetensors", "model.layers.47.mlp.down_proj.weight": "model-00016-of-00034.safetensors", "model.layers.38.mlp.down_proj.weight": "model-00016-of-00034.safetensors", "model.layers.47.input_layernorm.weight": "model-00016-of-00034.safetensors", "model.layers.38.input_layernorm.weight": "model-00016-of-00034.safetensors", "model.layers.46.mlp.up_proj.weight": "model-00016-of-00034.safetensors", "model.layers.37.mlp.up_proj.weight": "model-00016-of-00034.safetensors", "model.layers.46.mlp.down_proj.weight": "model-00017-of-00034.safetensors", "model.layers.37.mlp.down_proj.weight": "model-00017-of-00034.safetensors", "model.layers.46.input_layernorm.weight": "model-00017-of-00034.safetensors", "model.layers.37.input_layernorm.weight": "model-00017-of-00034.safetensors", "model.layers.49.self_attn.o_proj.weight": "model-00017-of-00034.safetensors", "model.layers.49.self_attn.v_proj.weight": "model-00017-of-00034.safetensors", "model.layers.49.self_attn.k_proj.weight": "model-00017-of-00034.safetensors", "model.layers.49.self_attn.k_norm.weight": "model-00017-of-00034.safetensors", "model.layers.49.self_attn.q_proj.weight": "model-00017-of-00034.safetensors", "model.layers.49.self_attn.q_norm.weight": "model-00017-of-00034.safetensors", "model.layers.49.mlp.gate_proj.weight": "model-00017-of-00034.safetensors", "model.layers.48.self_attn.o_proj.weight": "model-00017-of-00034.safetensors", "model.layers.39.self_attn.o_proj.weight": "model-00017-of-00034.safetensors", "model.layers.48.self_attn.v_proj.weight": "model-00017-of-00034.safetensors", "model.layers.39.self_attn.v_proj.weight": "model-00017-of-00034.safetensors", "model.layers.48.self_attn.k_proj.weight": "model-00017-of-00034.safetensors", "model.layers.39.self_attn.k_proj.weight": "model-00017-of-00034.safetensors", "model.layers.48.self_attn.q_proj.weight": "model-00017-of-00034.safetensors", "model.layers.39.self_attn.q_proj.weight": "model-00017-of-00034.safetensors", "model.layers.48.mlp.up_proj.weight": "model-00017-of-00034.safetensors", "model.layers.39.mlp.up_proj.weight": "model-00017-of-00034.safetensors", "model.layers.48.mlp.gate_proj.weight": "model-00017-of-00034.safetensors", "model.layers.39.mlp.gate_proj.weight": "model-00017-of-00034.safetensors", "model.layers.48.mlp.down_proj.weight": "model-00017-of-00034.safetensors", "model.layers.39.mlp.down_proj.weight": "model-00017-of-00034.safetensors", "model.layers.48.input_layernorm.weight": "model-00017-of-00034.safetensors", "model.layers.39.input_layernorm.weight": "model-00017-of-00034.safetensors", "model.layers.60.self_attn.k_norm.weight": "model-00017-of-00034.safetensors", "model.layers.51.self_attn.k_norm.weight": "model-00017-of-00034.safetensors", 
"model.layers.60.self_attn.q_norm.weight": "model-00017-of-00034.safetensors", "model.layers.51.self_attn.q_norm.weight": "model-00017-of-00034.safetensors", "model.layers.50.self_attn.o_proj.weight": "model-00017-of-00034.safetensors", "model.layers.50.self_attn.v_proj.weight": "model-00017-of-00034.safetensors", "model.layers.50.self_attn.k_proj.weight": "model-00017-of-00034.safetensors", "model.layers.50.self_attn.k_norm.weight": "model-00017-of-00034.safetensors", "model.layers.50.self_attn.q_proj.weight": "model-00018-of-00034.safetensors", "model.layers.50.self_attn.q_norm.weight": "model-00018-of-00034.safetensors", "model.layers.50.mlp.up_proj.weight": "model-00018-of-00034.safetensors", "model.layers.50.mlp.gate_proj.weight": "model-00018-of-00034.safetensors", "model.layers.50.mlp.down_proj.weight": "model-00018-of-00034.safetensors", "model.layers.50.input_layernorm.weight": "model-00018-of-00034.safetensors", "model.layers.49.mlp.up_proj.weight": "model-00018-of-00034.safetensors", "model.layers.49.mlp.down_proj.weight": "model-00018-of-00034.safetensors", "model.layers.49.input_layernorm.weight": "model-00018-of-00034.safetensors", "model.layers.61.self_attn.o_proj.weight": "model-00018-of-00034.safetensors", "model.layers.52.self_attn.o_proj.weight": "model-00018-of-00034.safetensors", "model.layers.61.self_attn.v_proj.weight": "model-00018-of-00034.safetensors", "model.layers.52.self_attn.v_proj.weight": "model-00018-of-00034.safetensors", "model.layers.61.self_attn.k_proj.weight": "model-00018-of-00034.safetensors", "model.layers.52.self_attn.k_proj.weight": "model-00018-of-00034.safetensors", "model.layers.61.self_attn.k_norm.weight": "model-00018-of-00034.safetensors", "model.layers.52.self_attn.k_norm.weight": "model-00018-of-00034.safetensors", "model.layers.61.self_attn.q_proj.weight": "model-00018-of-00034.safetensors", "model.layers.52.self_attn.q_proj.weight": "model-00018-of-00034.safetensors", "model.layers.61.self_attn.q_norm.weight": "model-00018-of-00034.safetensors", "model.layers.52.self_attn.q_norm.weight": "model-00018-of-00034.safetensors", "model.layers.61.mlp.gate_proj.weight": "model-00018-of-00034.safetensors", "model.layers.52.mlp.gate_proj.weight": "model-00018-of-00034.safetensors", "model.layers.60.self_attn.o_proj.weight": "model-00018-of-00034.safetensors", "model.layers.51.self_attn.o_proj.weight": "model-00018-of-00034.safetensors", "model.layers.60.self_attn.v_proj.weight": "model-00018-of-00034.safetensors", "model.layers.51.self_attn.v_proj.weight": "model-00018-of-00034.safetensors", "model.layers.60.self_attn.k_proj.weight": "model-00018-of-00034.safetensors", "model.layers.51.self_attn.k_proj.weight": "model-00018-of-00034.safetensors", "model.layers.60.self_attn.q_proj.weight": "model-00018-of-00034.safetensors", "model.layers.51.self_attn.q_proj.weight": "model-00018-of-00034.safetensors", "model.layers.60.mlp.up_proj.weight": "model-00018-of-00034.safetensors", "model.layers.51.mlp.up_proj.weight": "model-00019-of-00034.safetensors", "model.layers.60.mlp.gate_proj.weight": "model-00019-of-00034.safetensors", "model.layers.51.mlp.gate_proj.weight": "model-00019-of-00034.safetensors", "model.layers.60.mlp.down_proj.weight": "model-00019-of-00034.safetensors", "model.layers.51.mlp.down_proj.weight": "model-00019-of-00034.safetensors", "model.layers.60.input_layernorm.weight": "model-00019-of-00034.safetensors", "model.layers.51.input_layernorm.weight": "model-00019-of-00034.safetensors", "model.layers.63.self_attn.k_norm.weight": 
"model-00019-of-00034.safetensors", "model.layers.54.self_attn.k_norm.weight": "model-00019-of-00034.safetensors", "model.layers.63.self_attn.q_norm.weight": "model-00019-of-00034.safetensors", "model.layers.54.self_attn.q_norm.weight": "model-00019-of-00034.safetensors", "model.layers.62.self_attn.o_proj.weight": "model-00019-of-00034.safetensors", "model.layers.53.self_attn.o_proj.weight": "model-00019-of-00034.safetensors", "model.layers.62.self_attn.v_proj.weight": "model-00019-of-00034.safetensors", "model.layers.53.self_attn.v_proj.weight": "model-00019-of-00034.safetensors", "model.layers.62.self_attn.k_proj.weight": "model-00019-of-00034.safetensors", "model.layers.53.self_attn.k_proj.weight": "model-00019-of-00034.safetensors", "model.layers.62.self_attn.k_norm.weight": "model-00019-of-00034.safetensors", "model.layers.53.self_attn.k_norm.weight": "model-00019-of-00034.safetensors", "model.layers.62.self_attn.q_proj.weight": "model-00019-of-00034.safetensors", "model.layers.53.self_attn.q_proj.weight": "model-00019-of-00034.safetensors", "model.layers.62.self_attn.q_norm.weight": "model-00019-of-00034.safetensors", "model.layers.53.self_attn.q_norm.weight": "model-00019-of-00034.safetensors", "model.layers.62.mlp.up_proj.weight": "model-00019-of-00034.safetensors", "model.layers.53.mlp.up_proj.weight": "model-00019-of-00034.safetensors", "model.layers.62.mlp.gate_proj.weight": "model-00019-of-00034.safetensors", "model.layers.53.mlp.gate_proj.weight": "model-00019-of-00034.safetensors", "model.layers.62.mlp.down_proj.weight": "model-00019-of-00034.safetensors", "model.layers.53.mlp.down_proj.weight": "model-00020-of-00034.safetensors", "model.layers.62.input_layernorm.weight": "model-00020-of-00034.safetensors", "model.layers.53.input_layernorm.weight": "model-00020-of-00034.safetensors", "model.layers.61.mlp.up_proj.weight": "model-00020-of-00034.safetensors", "model.layers.52.mlp.up_proj.weight": "model-00020-of-00034.safetensors", "model.layers.61.mlp.down_proj.weight": "model-00020-of-00034.safetensors", "model.layers.52.mlp.down_proj.weight": "model-00020-of-00034.safetensors", "model.layers.61.input_layernorm.weight": "model-00020-of-00034.safetensors", "model.layers.52.input_layernorm.weight": "model-00020-of-00034.safetensors", "model.layers.64.self_attn.o_proj.weight": "model-00020-of-00034.safetensors", "model.layers.55.self_attn.o_proj.weight": "model-00020-of-00034.safetensors", "model.layers.64.self_attn.v_proj.weight": "model-00020-of-00034.safetensors", "model.layers.55.self_attn.v_proj.weight": "model-00020-of-00034.safetensors", "model.layers.64.self_attn.k_proj.weight": "model-00020-of-00034.safetensors", "model.layers.55.self_attn.k_proj.weight": "model-00020-of-00034.safetensors", "model.layers.64.self_attn.k_norm.weight": "model-00020-of-00034.safetensors", "model.layers.55.self_attn.k_norm.weight": "model-00020-of-00034.safetensors", "model.layers.64.self_attn.q_proj.weight": "model-00020-of-00034.safetensors", "model.layers.55.self_attn.q_proj.weight": "model-00020-of-00034.safetensors", "model.layers.64.self_attn.q_norm.weight": "model-00020-of-00034.safetensors", "model.layers.55.self_attn.q_norm.weight": "model-00020-of-00034.safetensors", "model.layers.64.mlp.gate_proj.weight": "model-00020-of-00034.safetensors", "model.layers.55.mlp.gate_proj.weight": "model-00020-of-00034.safetensors", "model.layers.63.self_attn.o_proj.weight": "model-00020-of-00034.safetensors", "model.layers.54.self_attn.o_proj.weight": "model-00020-of-00034.safetensors", 
"model.layers.63.self_attn.v_proj.weight": "model-00020-of-00034.safetensors", "model.layers.54.self_attn.v_proj.weight": "model-00020-of-00034.safetensors", "model.layers.63.self_attn.k_proj.weight": "model-00020-of-00034.safetensors", "model.layers.54.self_attn.k_proj.weight": "model-00020-of-00034.safetensors", "model.layers.63.self_attn.q_proj.weight": "model-00020-of-00034.safetensors", "model.layers.54.self_attn.q_proj.weight": "model-00020-of-00034.safetensors", "model.layers.63.mlp.up_proj.weight": "model-00020-of-00034.safetensors", "model.layers.54.mlp.up_proj.weight": "model-00021-of-00034.safetensors", "model.layers.63.mlp.gate_proj.weight": "model-00021-of-00034.safetensors", "model.layers.54.mlp.gate_proj.weight": "model-00021-of-00034.safetensors", "model.layers.63.mlp.down_proj.weight": "model-00021-of-00034.safetensors", "model.layers.54.mlp.down_proj.weight": "model-00021-of-00034.safetensors", "model.layers.63.input_layernorm.weight": "model-00021-of-00034.safetensors", "model.layers.54.input_layernorm.weight": "model-00021-of-00034.safetensors", "model.layers.66.self_attn.k_norm.weight": "model-00021-of-00034.safetensors", "model.layers.57.self_attn.k_norm.weight": "model-00021-of-00034.safetensors", "model.layers.66.self_attn.q_norm.weight": "model-00021-of-00034.safetensors", "model.layers.57.self_attn.q_norm.weight": "model-00021-of-00034.safetensors", "model.layers.65.self_attn.o_proj.weight": "model-00021-of-00034.safetensors", "model.layers.56.self_attn.o_proj.weight": "model-00021-of-00034.safetensors", "model.layers.65.self_attn.v_proj.weight": "model-00021-of-00034.safetensors", "model.layers.56.self_attn.v_proj.weight": "model-00021-of-00034.safetensors", "model.layers.65.self_attn.k_proj.weight": "model-00021-of-00034.safetensors", "model.layers.56.self_attn.k_proj.weight": "model-00021-of-00034.safetensors", "model.layers.65.self_attn.k_norm.weight": "model-00021-of-00034.safetensors", "model.layers.56.self_attn.k_norm.weight": "model-00021-of-00034.safetensors", "model.layers.65.self_attn.q_proj.weight": "model-00021-of-00034.safetensors", "model.layers.56.self_attn.q_proj.weight": "model-00021-of-00034.safetensors", "model.layers.65.self_attn.q_norm.weight": "model-00021-of-00034.safetensors", "model.layers.56.self_attn.q_norm.weight": "model-00021-of-00034.safetensors", "model.layers.65.mlp.up_proj.weight": "model-00021-of-00034.safetensors", "model.layers.56.mlp.up_proj.weight": "model-00021-of-00034.safetensors", "model.layers.65.mlp.gate_proj.weight": "model-00021-of-00034.safetensors", "model.layers.56.mlp.gate_proj.weight": "model-00021-of-00034.safetensors", "model.layers.65.mlp.down_proj.weight": "model-00021-of-00034.safetensors", "model.layers.56.mlp.down_proj.weight": "model-00022-of-00034.safetensors", "model.layers.65.input_layernorm.weight": "model-00022-of-00034.safetensors", "model.layers.56.input_layernorm.weight": "model-00022-of-00034.safetensors", "model.layers.64.mlp.up_proj.weight": "model-00022-of-00034.safetensors", "model.layers.55.mlp.up_proj.weight": "model-00022-of-00034.safetensors", "model.layers.64.mlp.down_proj.weight": "model-00022-of-00034.safetensors", "model.layers.55.mlp.down_proj.weight": "model-00022-of-00034.safetensors", "model.layers.64.input_layernorm.weight": "model-00022-of-00034.safetensors", "model.layers.55.input_layernorm.weight": "model-00022-of-00034.safetensors", "model.layers.67.self_attn.o_proj.weight": "model-00022-of-00034.safetensors", "model.layers.58.self_attn.o_proj.weight": 
"model-00022-of-00034.safetensors", "model.layers.67.self_attn.v_proj.weight": "model-00022-of-00034.safetensors", "model.layers.58.self_attn.v_proj.weight": "model-00022-of-00034.safetensors", "model.layers.67.self_attn.k_proj.weight": "model-00022-of-00034.safetensors", "model.layers.58.self_attn.k_proj.weight": "model-00022-of-00034.safetensors", "model.layers.67.self_attn.k_norm.weight": "model-00022-of-00034.safetensors", "model.layers.58.self_attn.k_norm.weight": "model-00022-of-00034.safetensors", "model.layers.67.self_attn.q_proj.weight": "model-00022-of-00034.safetensors", "model.layers.58.self_attn.q_proj.weight": "model-00022-of-00034.safetensors", "model.layers.67.self_attn.q_norm.weight": "model-00022-of-00034.safetensors", "model.layers.58.self_attn.q_norm.weight": "model-00022-of-00034.safetensors", "model.layers.67.mlp.gate_proj.weight": "model-00022-of-00034.safetensors", "model.layers.58.mlp.gate_proj.weight": "model-00022-of-00034.safetensors", "model.layers.66.self_attn.o_proj.weight": "model-00022-of-00034.safetensors", "model.layers.57.self_attn.o_proj.weight": "model-00022-of-00034.safetensors", "model.layers.66.self_attn.v_proj.weight": "model-00022-of-00034.safetensors", "model.layers.57.self_attn.v_proj.weight": "model-00022-of-00034.safetensors", "model.layers.66.self_attn.k_proj.weight": "model-00022-of-00034.safetensors", "model.layers.57.self_attn.k_proj.weight": "model-00022-of-00034.safetensors", "model.layers.66.self_attn.q_proj.weight": "model-00022-of-00034.safetensors", "model.layers.57.self_attn.q_proj.weight": "model-00022-of-00034.safetensors", "model.layers.66.mlp.up_proj.weight": "model-00022-of-00034.safetensors", "model.layers.57.mlp.up_proj.weight": "model-00023-of-00034.safetensors", "model.layers.66.mlp.gate_proj.weight": "model-00023-of-00034.safetensors", "model.layers.57.mlp.gate_proj.weight": "model-00023-of-00034.safetensors", "model.layers.66.mlp.down_proj.weight": "model-00023-of-00034.safetensors", "model.layers.57.mlp.down_proj.weight": "model-00023-of-00034.safetensors", "model.layers.66.input_layernorm.weight": "model-00023-of-00034.safetensors", "model.layers.57.input_layernorm.weight": "model-00023-of-00034.safetensors", "model.layers.69.self_attn.k_norm.weight": "model-00023-of-00034.safetensors", "model.layers.69.self_attn.q_norm.weight": "model-00023-of-00034.safetensors", "model.layers.68.self_attn.o_proj.weight": "model-00023-of-00034.safetensors", "model.layers.59.self_attn.o_proj.weight": "model-00023-of-00034.safetensors", "model.layers.68.self_attn.v_proj.weight": "model-00023-of-00034.safetensors", "model.layers.59.self_attn.v_proj.weight": "model-00023-of-00034.safetensors", "model.layers.68.self_attn.k_proj.weight": "model-00023-of-00034.safetensors", "model.layers.59.self_attn.k_proj.weight": "model-00023-of-00034.safetensors", "model.layers.68.self_attn.k_norm.weight": "model-00023-of-00034.safetensors", "model.layers.59.self_attn.k_norm.weight": "model-00023-of-00034.safetensors", "model.layers.68.self_attn.q_proj.weight": "model-00023-of-00034.safetensors", "model.layers.59.self_attn.q_proj.weight": "model-00023-of-00034.safetensors", "model.layers.68.self_attn.q_norm.weight": "model-00023-of-00034.safetensors", "model.layers.59.self_attn.q_norm.weight": "model-00023-of-00034.safetensors", "model.layers.68.mlp.up_proj.weight": "model-00023-of-00034.safetensors", "model.layers.59.mlp.up_proj.weight": "model-00023-of-00034.safetensors", "model.layers.68.mlp.gate_proj.weight": "model-00023-of-00034.safetensors", 
"model.layers.59.mlp.gate_proj.weight": "model-00023-of-00034.safetensors", "model.layers.68.mlp.down_proj.weight": "model-00023-of-00034.safetensors", "model.layers.59.mlp.down_proj.weight": "model-00024-of-00034.safetensors", "model.layers.68.input_layernorm.weight": "model-00024-of-00034.safetensors", "model.layers.59.input_layernorm.weight": "model-00024-of-00034.safetensors", "model.layers.67.mlp.up_proj.weight": "model-00024-of-00034.safetensors", "model.layers.58.mlp.up_proj.weight": "model-00024-of-00034.safetensors", "model.layers.67.mlp.down_proj.weight": "model-00024-of-00034.safetensors", "model.layers.58.mlp.down_proj.weight": "model-00024-of-00034.safetensors", "model.layers.67.input_layernorm.weight": "model-00024-of-00034.safetensors", "model.layers.58.input_layernorm.weight": "model-00024-of-00034.safetensors", "model.layers.70.self_attn.o_proj.weight": "model-00024-of-00034.safetensors", "model.layers.70.self_attn.v_proj.weight": "model-00024-of-00034.safetensors", "model.layers.70.self_attn.k_proj.weight": "model-00024-of-00034.safetensors", "model.layers.70.self_attn.k_norm.weight": "model-00024-of-00034.safetensors", "model.layers.70.self_attn.q_proj.weight": "model-00024-of-00034.safetensors", "model.layers.70.self_attn.q_norm.weight": "model-00024-of-00034.safetensors", "model.layers.70.mlp.gate_proj.weight": "model-00024-of-00034.safetensors", "model.layers.69.self_attn.o_proj.weight": "model-00024-of-00034.safetensors", "model.layers.69.self_attn.v_proj.weight": "model-00024-of-00034.safetensors", "model.layers.69.self_attn.k_proj.weight": "model-00024-of-00034.safetensors", "model.layers.69.self_attn.q_proj.weight": "model-00024-of-00034.safetensors", "model.layers.69.mlp.up_proj.weight": "model-00024-of-00034.safetensors", "model.layers.69.mlp.gate_proj.weight": "model-00024-of-00034.safetensors", "model.layers.69.mlp.down_proj.weight": "model-00024-of-00034.safetensors", "model.layers.69.input_layernorm.weight": "model-00024-of-00034.safetensors", "model.layers.81.self_attn.k_norm.weight": "model-00024-of-00034.safetensors", "model.layers.72.self_attn.k_norm.weight": "model-00024-of-00034.safetensors", "model.layers.81.self_attn.q_norm.weight": "model-00024-of-00034.safetensors", "model.layers.72.self_attn.q_norm.weight": "model-00024-of-00034.safetensors", "model.layers.80.self_attn.o_proj.weight": "model-00024-of-00034.safetensors", "model.layers.71.self_attn.o_proj.weight": "model-00024-of-00034.safetensors", "model.layers.80.self_attn.v_proj.weight": "model-00024-of-00034.safetensors", "model.layers.71.self_attn.v_proj.weight": "model-00024-of-00034.safetensors", "model.layers.80.self_attn.k_proj.weight": "model-00024-of-00034.safetensors", "model.layers.71.self_attn.k_proj.weight": "model-00024-of-00034.safetensors", "model.layers.80.self_attn.k_norm.weight": "model-00024-of-00034.safetensors", "model.layers.71.self_attn.k_norm.weight": "model-00024-of-00034.safetensors", "model.layers.80.self_attn.q_proj.weight": "model-00024-of-00034.safetensors", "model.layers.71.self_attn.q_proj.weight": "model-00025-of-00034.safetensors", "model.layers.80.self_attn.q_norm.weight": "model-00025-of-00034.safetensors", "model.layers.71.self_attn.q_norm.weight": "model-00025-of-00034.safetensors", "model.layers.80.mlp.up_proj.weight": "model-00025-of-00034.safetensors", "model.layers.71.mlp.up_proj.weight": "model-00025-of-00034.safetensors", "model.layers.80.mlp.gate_proj.weight": "model-00025-of-00034.safetensors", "model.layers.71.mlp.gate_proj.weight": 
"model-00025-of-00034.safetensors", "model.layers.80.mlp.down_proj.weight": "model-00025-of-00034.safetensors", "model.layers.71.mlp.down_proj.weight": "model-00025-of-00034.safetensors", "model.layers.80.input_layernorm.weight": "model-00025-of-00034.safetensors", "model.layers.71.input_layernorm.weight": "model-00025-of-00034.safetensors", "model.layers.70.mlp.up_proj.weight": "model-00025-of-00034.safetensors", "model.layers.70.mlp.down_proj.weight": "model-00025-of-00034.safetensors", "model.layers.70.input_layernorm.weight": "model-00025-of-00034.safetensors", "model.layers.82.self_attn.o_proj.weight": "model-00025-of-00034.safetensors", "model.layers.73.self_attn.o_proj.weight": "model-00025-of-00034.safetensors", "model.layers.82.self_attn.v_proj.weight": "model-00025-of-00034.safetensors", "model.layers.73.self_attn.v_proj.weight": "model-00025-of-00034.safetensors", "model.layers.82.self_attn.k_proj.weight": "model-00025-of-00034.safetensors", "model.layers.73.self_attn.k_proj.weight": "model-00025-of-00034.safetensors", "model.layers.82.self_attn.k_norm.weight": "model-00025-of-00034.safetensors", "model.layers.73.self_attn.k_norm.weight": "model-00025-of-00034.safetensors", "model.layers.82.self_attn.q_proj.weight": "model-00025-of-00034.safetensors", "model.layers.73.self_attn.q_proj.weight": "model-00025-of-00034.safetensors", "model.layers.82.self_attn.q_norm.weight": "model-00025-of-00034.safetensors", "model.layers.73.self_attn.q_norm.weight": "model-00025-of-00034.safetensors", "model.layers.82.mlp.gate_proj.weight": "model-00025-of-00034.safetensors", "model.layers.73.mlp.gate_proj.weight": "model-00025-of-00034.safetensors", "model.layers.81.self_attn.o_proj.weight": "model-00026-of-00034.safetensors", "model.layers.72.self_attn.o_proj.weight": "model-00026-of-00034.safetensors", "model.layers.81.self_attn.v_proj.weight": "model-00026-of-00034.safetensors", "model.layers.72.self_attn.v_proj.weight": "model-00026-of-00034.safetensors", "model.layers.81.self_attn.k_proj.weight": "model-00026-of-00034.safetensors", "model.layers.72.self_attn.k_proj.weight": "model-00026-of-00034.safetensors", "model.layers.81.self_attn.q_proj.weight": "model-00026-of-00034.safetensors", "model.layers.72.self_attn.q_proj.weight": "model-00026-of-00034.safetensors", "model.layers.81.mlp.up_proj.weight": "model-00026-of-00034.safetensors", "model.layers.72.mlp.up_proj.weight": "model-00026-of-00034.safetensors", "model.layers.81.mlp.gate_proj.weight": "model-00026-of-00034.safetensors", "model.layers.72.mlp.gate_proj.weight": "model-00026-of-00034.safetensors", "model.layers.81.mlp.down_proj.weight": "model-00026-of-00034.safetensors", "model.layers.72.mlp.down_proj.weight": "model-00026-of-00034.safetensors", "model.layers.81.input_layernorm.weight": "model-00026-of-00034.safetensors", "model.layers.72.input_layernorm.weight": "model-00026-of-00034.safetensors", "model.layers.84.self_attn.k_norm.weight": "model-00026-of-00034.safetensors", "model.layers.75.self_attn.k_norm.weight": "model-00026-of-00034.safetensors", "model.layers.84.self_attn.q_norm.weight": "model-00026-of-00034.safetensors", "model.layers.75.self_attn.q_norm.weight": "model-00026-of-00034.safetensors", "model.layers.83.self_attn.o_proj.weight": "model-00026-of-00034.safetensors", "model.layers.74.self_attn.o_proj.weight": "model-00026-of-00034.safetensors", "model.layers.83.self_attn.v_proj.weight": "model-00026-of-00034.safetensors", "model.layers.74.self_attn.v_proj.weight": "model-00026-of-00034.safetensors", 
"model.layers.83.self_attn.k_proj.weight": "model-00026-of-00034.safetensors", "model.layers.74.self_attn.k_proj.weight": "model-00026-of-00034.safetensors", "model.layers.83.self_attn.k_norm.weight": "model-00026-of-00034.safetensors", "model.layers.74.self_attn.k_norm.weight": "model-00026-of-00034.safetensors", "model.layers.83.self_attn.q_proj.weight": "model-00026-of-00034.safetensors", "model.layers.74.self_attn.q_proj.weight": "model-00026-of-00034.safetensors", "model.layers.83.self_attn.q_norm.weight": "model-00026-of-00034.safetensors", "model.layers.74.self_attn.q_norm.weight": "model-00026-of-00034.safetensors", "model.layers.83.mlp.up_proj.weight": "model-00026-of-00034.safetensors", "model.layers.74.mlp.up_proj.weight": "model-00026-of-00034.safetensors", "model.layers.83.mlp.gate_proj.weight": "model-00027-of-00034.safetensors", "model.layers.74.mlp.gate_proj.weight": "model-00027-of-00034.safetensors", "model.layers.83.mlp.down_proj.weight": "model-00027-of-00034.safetensors", "model.layers.74.mlp.down_proj.weight": "model-00027-of-00034.safetensors", "model.layers.83.input_layernorm.weight": "model-00027-of-00034.safetensors", "model.layers.74.input_layernorm.weight": "model-00027-of-00034.safetensors", "model.layers.82.mlp.up_proj.weight": "model-00027-of-00034.safetensors", "model.layers.73.mlp.up_proj.weight": "model-00027-of-00034.safetensors", "model.layers.82.mlp.down_proj.weight": "model-00027-of-00034.safetensors", "model.layers.73.mlp.down_proj.weight": "model-00027-of-00034.safetensors", "model.layers.82.input_layernorm.weight": "model-00027-of-00034.safetensors", "model.layers.73.input_layernorm.weight": "model-00027-of-00034.safetensors", "model.layers.85.self_attn.o_proj.weight": "model-00027-of-00034.safetensors", "model.layers.76.self_attn.o_proj.weight": "model-00027-of-00034.safetensors", "model.layers.85.self_attn.v_proj.weight": "model-00027-of-00034.safetensors", "model.layers.76.self_attn.v_proj.weight": "model-00027-of-00034.safetensors", "model.layers.85.self_attn.k_proj.weight": "model-00027-of-00034.safetensors", "model.layers.76.self_attn.k_proj.weight": "model-00027-of-00034.safetensors", "model.layers.85.self_attn.k_norm.weight": "model-00027-of-00034.safetensors", "model.layers.76.self_attn.k_norm.weight": "model-00027-of-00034.safetensors", "model.layers.85.self_attn.q_proj.weight": "model-00027-of-00034.safetensors", "model.layers.76.self_attn.q_proj.weight": "model-00027-of-00034.safetensors", "model.layers.85.self_attn.q_norm.weight": "model-00027-of-00034.safetensors", "model.layers.76.self_attn.q_norm.weight": "model-00027-of-00034.safetensors", "model.layers.85.mlp.gate_proj.weight": "model-00027-of-00034.safetensors", "model.layers.76.mlp.gate_proj.weight": "model-00027-of-00034.safetensors", "model.layers.84.self_attn.o_proj.weight": "model-00027-of-00034.safetensors", "model.layers.75.self_attn.o_proj.weight": "model-00028-of-00034.safetensors", "model.layers.84.self_attn.v_proj.weight": "model-00028-of-00034.safetensors", "model.layers.75.self_attn.v_proj.weight": "model-00028-of-00034.safetensors", "model.layers.84.self_attn.k_proj.weight": "model-00028-of-00034.safetensors", "model.layers.75.self_attn.k_proj.weight": "model-00028-of-00034.safetensors", "model.layers.84.self_attn.q_proj.weight": "model-00028-of-00034.safetensors", "model.layers.75.self_attn.q_proj.weight": "model-00028-of-00034.safetensors", "model.layers.84.mlp.up_proj.weight": "model-00028-of-00034.safetensors", "model.layers.75.mlp.up_proj.weight": 
"model-00028-of-00034.safetensors", "model.layers.84.mlp.gate_proj.weight": "model-00028-of-00034.safetensors", "model.layers.75.mlp.gate_proj.weight": "model-00028-of-00034.safetensors", "model.layers.84.mlp.down_proj.weight": "model-00028-of-00034.safetensors", "model.layers.75.mlp.down_proj.weight": "model-00028-of-00034.safetensors", "model.layers.84.input_layernorm.weight": "model-00028-of-00034.safetensors", "model.layers.75.input_layernorm.weight": "model-00028-of-00034.safetensors", "model.layers.87.self_attn.k_norm.weight": "model-00028-of-00034.safetensors", "model.layers.78.self_attn.k_norm.weight": "model-00028-of-00034.safetensors", "model.layers.87.self_attn.q_norm.weight": "model-00028-of-00034.safetensors", "model.layers.78.self_attn.q_norm.weight": "model-00028-of-00034.safetensors", "model.layers.86.self_attn.o_proj.weight": "model-00028-of-00034.safetensors", "model.layers.77.self_attn.o_proj.weight": "model-00028-of-00034.safetensors", "model.layers.86.self_attn.v_proj.weight": "model-00028-of-00034.safetensors", "model.layers.77.self_attn.v_proj.weight": "model-00028-of-00034.safetensors", "model.layers.86.self_attn.k_proj.weight": "model-00028-of-00034.safetensors", "model.layers.77.self_attn.k_proj.weight": "model-00028-of-00034.safetensors", "model.layers.86.self_attn.k_norm.weight": "model-00028-of-00034.safetensors", "model.layers.77.self_attn.k_norm.weight": "model-00028-of-00034.safetensors", "model.layers.86.self_attn.q_proj.weight": "model-00028-of-00034.safetensors", "model.layers.77.self_attn.q_proj.weight": "model-00028-of-00034.safetensors", "model.layers.86.self_attn.q_norm.weight": "model-00028-of-00034.safetensors", "model.layers.77.self_attn.q_norm.weight": "model-00028-of-00034.safetensors", "model.layers.86.mlp.up_proj.weight": "model-00028-of-00034.safetensors", "model.layers.77.mlp.up_proj.weight": "model-00028-of-00034.safetensors", "model.layers.86.mlp.gate_proj.weight": "model-00028-of-00034.safetensors", "model.layers.77.mlp.gate_proj.weight": "model-00029-of-00034.safetensors", "model.layers.86.mlp.down_proj.weight": "model-00029-of-00034.safetensors", "model.layers.77.mlp.down_proj.weight": "model-00029-of-00034.safetensors", "model.layers.86.input_layernorm.weight": "model-00029-of-00034.safetensors", "model.layers.77.input_layernorm.weight": "model-00029-of-00034.safetensors", "model.layers.85.mlp.up_proj.weight": "model-00029-of-00034.safetensors", "model.layers.76.mlp.up_proj.weight": "model-00029-of-00034.safetensors", "model.layers.85.mlp.down_proj.weight": "model-00029-of-00034.safetensors", "model.layers.76.mlp.down_proj.weight": "model-00029-of-00034.safetensors", "model.layers.85.input_layernorm.weight": "model-00029-of-00034.safetensors", "model.layers.76.input_layernorm.weight": "model-00029-of-00034.safetensors", "model.layers.88.self_attn.o_proj.weight": "model-00029-of-00034.safetensors", "model.layers.79.self_attn.o_proj.weight": "model-00029-of-00034.safetensors", "model.layers.88.self_attn.v_proj.weight": "model-00029-of-00034.safetensors", "model.layers.79.self_attn.v_proj.weight": "model-00029-of-00034.safetensors", "model.layers.88.self_attn.k_proj.weight": "model-00029-of-00034.safetensors", "model.layers.79.self_attn.k_proj.weight": "model-00029-of-00034.safetensors", "model.layers.88.self_attn.k_norm.weight": "model-00029-of-00034.safetensors", "model.layers.79.self_attn.k_norm.weight": "model-00029-of-00034.safetensors", "model.layers.88.self_attn.q_proj.weight": "model-00029-of-00034.safetensors", 
"model.layers.79.self_attn.q_proj.weight": "model-00029-of-00034.safetensors", "model.layers.88.self_attn.q_norm.weight": "model-00029-of-00034.safetensors", "model.layers.79.self_attn.q_norm.weight": "model-00029-of-00034.safetensors", "model.layers.88.mlp.gate_proj.weight": "model-00029-of-00034.safetensors", "model.layers.79.mlp.gate_proj.weight": "model-00029-of-00034.safetensors", "model.layers.87.self_attn.o_proj.weight": "model-00029-of-00034.safetensors", "model.layers.78.self_attn.o_proj.weight": "model-00029-of-00034.safetensors", "model.layers.87.self_attn.v_proj.weight": "model-00029-of-00034.safetensors", "model.layers.78.self_attn.v_proj.weight": "model-00029-of-00034.safetensors", "model.layers.87.self_attn.k_proj.weight": "model-00029-of-00034.safetensors", "model.layers.78.self_attn.k_proj.weight": "model-00029-of-00034.safetensors", "model.layers.87.self_attn.q_proj.weight": "model-00029-of-00034.safetensors", "model.layers.78.self_attn.q_proj.weight": "model-00030-of-00034.safetensors", "model.layers.87.mlp.up_proj.weight": "model-00030-of-00034.safetensors", "model.layers.78.mlp.up_proj.weight": "model-00030-of-00034.safetensors", "model.layers.87.mlp.gate_proj.weight": "model-00030-of-00034.safetensors", "model.layers.78.mlp.gate_proj.weight": "model-00030-of-00034.safetensors", "model.layers.87.mlp.down_proj.weight": "model-00030-of-00034.safetensors", "model.layers.78.mlp.down_proj.weight": "model-00030-of-00034.safetensors", "model.layers.87.input_layernorm.weight": "model-00030-of-00034.safetensors", "model.layers.78.input_layernorm.weight": "model-00030-of-00034.safetensors", "model.layers.90.self_attn.k_norm.weight": "model-00030-of-00034.safetensors", "model.layers.90.self_attn.q_norm.weight": "model-00030-of-00034.safetensors", "model.layers.89.self_attn.o_proj.weight": "model-00030-of-00034.safetensors", "model.layers.89.self_attn.v_proj.weight": "model-00030-of-00034.safetensors", "model.layers.89.self_attn.k_proj.weight": "model-00030-of-00034.safetensors", "model.layers.89.self_attn.k_norm.weight": "model-00030-of-00034.safetensors", "model.layers.89.self_attn.q_proj.weight": "model-00030-of-00034.safetensors", "model.layers.89.self_attn.q_norm.weight": "model-00030-of-00034.safetensors", "model.layers.89.mlp.up_proj.weight": "model-00030-of-00034.safetensors", "model.layers.89.mlp.gate_proj.weight": "model-00030-of-00034.safetensors", "model.layers.89.mlp.down_proj.weight": "model-00030-of-00034.safetensors", "model.layers.89.input_layernorm.weight": "model-00030-of-00034.safetensors", "model.layers.88.mlp.up_proj.weight": "model-00030-of-00034.safetensors", "model.layers.79.mlp.up_proj.weight": "model-00031-of-00034.safetensors", "model.layers.88.mlp.down_proj.weight": "model-00031-of-00034.safetensors", "model.layers.79.mlp.down_proj.weight": "model-00031-of-00034.safetensors", "model.layers.88.input_layernorm.weight": "model-00031-of-00034.safetensors", "model.layers.79.input_layernorm.weight": "model-00031-of-00034.safetensors", "model.layers.91.self_attn.o_proj.weight": "model-00031-of-00034.safetensors", "model.layers.91.self_attn.v_proj.weight": "model-00031-of-00034.safetensors", "model.layers.91.self_attn.k_proj.weight": "model-00031-of-00034.safetensors", "model.layers.91.self_attn.k_norm.weight": "model-00031-of-00034.safetensors", "model.layers.91.self_attn.q_proj.weight": "model-00031-of-00034.safetensors", "model.layers.91.self_attn.q_norm.weight": "model-00031-of-00034.safetensors", "model.layers.91.mlp.gate_proj.weight": 
"model-00031-of-00034.safetensors", "model.layers.90.self_attn.o_proj.weight": "model-00031-of-00034.safetensors", "model.layers.90.self_attn.v_proj.weight": "model-00031-of-00034.safetensors", "model.layers.90.self_attn.k_proj.weight": "model-00031-of-00034.safetensors", "model.layers.90.self_attn.q_proj.weight": "model-00031-of-00034.safetensors", "model.layers.90.mlp.up_proj.weight": "model-00031-of-00034.safetensors", "model.layers.90.mlp.gate_proj.weight": "model-00031-of-00034.safetensors", "model.layers.90.mlp.down_proj.weight": "model-00031-of-00034.safetensors", "model.layers.90.input_layernorm.weight": "model-00031-of-00034.safetensors", "model.layers.93.self_attn.k_norm.weight": "model-00031-of-00034.safetensors", "model.layers.93.self_attn.q_norm.weight": "model-00031-of-00034.safetensors", "model.layers.92.self_attn.o_proj.weight": "model-00031-of-00034.safetensors", "model.layers.92.self_attn.v_proj.weight": "model-00031-of-00034.safetensors", "model.layers.92.self_attn.k_proj.weight": "model-00031-of-00034.safetensors", "model.layers.92.self_attn.k_norm.weight": "model-00031-of-00034.safetensors", "model.layers.92.self_attn.q_proj.weight": "model-00031-of-00034.safetensors", "model.layers.92.self_attn.q_norm.weight": "model-00031-of-00034.safetensors", "model.layers.92.mlp.up_proj.weight": "model-00031-of-00034.safetensors", "model.layers.92.mlp.gate_proj.weight": "model-00031-of-00034.safetensors", "model.layers.92.mlp.down_proj.weight": "model-00032-of-00034.safetensors", "model.layers.92.input_layernorm.weight": "model-00032-of-00034.safetensors", "model.layers.91.mlp.up_proj.weight": "model-00032-of-00034.safetensors", "model.layers.91.mlp.down_proj.weight": "model-00032-of-00034.safetensors", "model.layers.91.input_layernorm.weight": "model-00032-of-00034.safetensors", "model.layers.94.self_attn.o_proj.weight": "model-00032-of-00034.safetensors", "model.layers.94.self_attn.v_proj.weight": "model-00032-of-00034.safetensors", "model.layers.94.self_attn.k_proj.weight": "model-00032-of-00034.safetensors", "model.layers.94.self_attn.k_norm.weight": "model-00032-of-00034.safetensors", "model.layers.94.self_attn.q_proj.weight": "model-00032-of-00034.safetensors", "model.layers.94.self_attn.q_norm.weight": "model-00032-of-00034.safetensors", "model.layers.94.mlp.gate_proj.weight": "model-00032-of-00034.safetensors", "model.layers.93.self_attn.o_proj.weight": "model-00032-of-00034.safetensors", "model.layers.93.self_attn.v_proj.weight": "model-00032-of-00034.safetensors", "model.layers.93.self_attn.k_proj.weight": "model-00032-of-00034.safetensors", "model.layers.93.self_attn.q_proj.weight": "model-00032-of-00034.safetensors", "model.layers.93.mlp.up_proj.weight": "model-00032-of-00034.safetensors", "model.layers.93.mlp.gate_proj.weight": "model-00032-of-00034.safetensors", "model.layers.93.mlp.down_proj.weight": "model-00032-of-00034.safetensors", "model.layers.93.input_layernorm.weight": "model-00032-of-00034.safetensors", "model.layers.96.self_attn.k_norm.weight": "model-00032-of-00034.safetensors", "model.layers.96.self_attn.q_norm.weight": "model-00032-of-00034.safetensors", "model.layers.95.self_attn.o_proj.weight": "model-00032-of-00034.safetensors", "model.layers.95.self_attn.v_proj.weight": "model-00032-of-00034.safetensors", "model.layers.95.self_attn.k_proj.weight": "model-00032-of-00034.safetensors", "model.layers.95.self_attn.k_norm.weight": "model-00032-of-00034.safetensors", "model.layers.95.self_attn.q_proj.weight": "model-00032-of-00034.safetensors", 
"model.layers.95.self_attn.q_norm.weight": "model-00032-of-00034.safetensors", "model.layers.95.mlp.up_proj.weight": "model-00032-of-00034.safetensors", "model.layers.95.mlp.gate_proj.weight": "model-00032-of-00034.safetensors", "model.layers.95.mlp.down_proj.weight": "model-00033-of-00034.safetensors", "model.layers.95.input_layernorm.weight": "model-00033-of-00034.safetensors", "model.layers.94.mlp.up_proj.weight": "model-00033-of-00034.safetensors", "model.layers.94.mlp.down_proj.weight": "model-00033-of-00034.safetensors", "model.layers.94.input_layernorm.weight": "model-00033-of-00034.safetensors", "model.layers.97.self_attn.o_proj.weight": "model-00033-of-00034.safetensors", "model.layers.97.self_attn.v_proj.weight": "model-00033-of-00034.safetensors", "model.layers.97.self_attn.k_proj.weight": "model-00033-of-00034.safetensors", "model.layers.97.self_attn.k_norm.weight": "model-00033-of-00034.safetensors", "model.layers.97.self_attn.q_proj.weight": "model-00033-of-00034.safetensors", "model.layers.97.self_attn.q_norm.weight": "model-00033-of-00034.safetensors", "model.layers.97.mlp.gate_proj.weight": "model-00033-of-00034.safetensors", "model.layers.96.self_attn.o_proj.weight": "model-00033-of-00034.safetensors", "model.layers.96.self_attn.v_proj.weight": "model-00033-of-00034.safetensors", "model.layers.96.self_attn.k_proj.weight": "model-00033-of-00034.safetensors", "model.layers.96.self_attn.q_proj.weight": "model-00033-of-00034.safetensors", "model.layers.96.mlp.up_proj.weight": "model-00033-of-00034.safetensors", "model.layers.96.mlp.gate_proj.weight": "model-00033-of-00034.safetensors", "model.layers.96.mlp.down_proj.weight": "model-00033-of-00034.safetensors", "model.layers.96.input_layernorm.weight": "model-00033-of-00034.safetensors", "model.layers.99.self_attn.k_norm.weight": "model-00033-of-00034.safetensors", "model.layers.99.self_attn.q_norm.weight": "model-00033-of-00034.safetensors", "model.layers.98.self_attn.o_proj.weight": "model-00033-of-00034.safetensors", "model.layers.98.self_attn.v_proj.weight": "model-00033-of-00034.safetensors", "model.layers.98.self_attn.k_proj.weight": "model-00033-of-00034.safetensors", "model.layers.98.self_attn.k_norm.weight": "model-00033-of-00034.safetensors", "model.layers.98.self_attn.q_proj.weight": "model-00033-of-00034.safetensors", "model.layers.98.self_attn.q_norm.weight": "model-00033-of-00034.safetensors", "model.layers.98.mlp.up_proj.weight": "model-00033-of-00034.safetensors", "model.layers.98.mlp.gate_proj.weight": "model-00033-of-00034.safetensors", "model.layers.98.mlp.down_proj.weight": "model-00034-of-00034.safetensors", "model.layers.98.input_layernorm.weight": "model-00034-of-00034.safetensors", "model.layers.97.mlp.up_proj.weight": "model-00034-of-00034.safetensors", "model.layers.97.mlp.down_proj.weight": "model-00034-of-00034.safetensors", "model.layers.97.input_layernorm.weight": "model-00034-of-00034.safetensors", "model.norm.weight": "model-00034-of-00034.safetensors", "model.layers.99.self_attn.o_proj.weight": "model-00034-of-00034.safetensors", "model.layers.99.self_attn.v_proj.weight": "model-00034-of-00034.safetensors", "model.layers.99.self_attn.k_proj.weight": "model-00034-of-00034.safetensors", "model.layers.99.self_attn.q_proj.weight": "model-00034-of-00034.safetensors", "model.layers.99.mlp.up_proj.weight": "model-00034-of-00034.safetensors", "model.layers.99.mlp.gate_proj.weight": "model-00034-of-00034.safetensors", "model.layers.99.mlp.down_proj.weight": "model-00034-of-00034.safetensors", 
"model.layers.99.input_layernorm.weight": "model-00034-of-00034.safetensors"}}
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
+ {
+   "bos_token": {
+     "content": "<BOS_TOKEN>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "<|END_OF_TURN_TOKEN|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "<PAD>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
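
These three entries should round-trip through the tokenizer; a quick sanity check (assuming this repo's Hugging Face id, `nitky/Megac4ai-command-r-plus`):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("nitky/Megac4ai-command-r-plus")
assert tok.bos_token == "<BOS_TOKEN>"
assert tok.eos_token == "<|END_OF_TURN_TOKEN|>"
assert tok.pad_token == "<PAD>"
```
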
tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0cc8a79eafcf1043fbfad77df083de446a61424b222284d602c4edee497ce1e4
+ size 12777405
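
Because tokenizer.json is tracked by Git LFS (per the .gitattributes change above), the repository stores only this pointer; the ~12.8 MB file itself is fetched at checkout. A small sketch to verify a downloaded copy against the pointer's hash (the local file path is an assumption):

```python
import hashlib

# Compare a local tokenizer.json against the oid recorded in the LFS pointer.
with open("tokenizer.json", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print(digest == "0cc8a79eafcf1043fbfad77df083de446a61424b222284d602c4edee497ce1e4")
```
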
tokenizer_config.json ADDED
@@ -0,0 +1,330 @@
+ {
+   "add_bos_token": true,
+   "add_eos_token": false,
+   "add_prefix_space": false,
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<PAD>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<UNK>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "<CLS>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "3": {
+       "content": "<SEP>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "4": {
+       "content": "<MASK_TOKEN>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "5": {
+       "content": "<BOS_TOKEN>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "6": {
+       "content": "<EOS_TOKEN>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "7": {
+       "content": "<EOP_TOKEN>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "255000": {
+       "content": "<|START_OF_TURN_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255001": {
+       "content": "<|END_OF_TURN_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "255002": {
+       "content": "<|YES_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255003": {
+       "content": "<|NO_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255004": {
+       "content": "<|GOOD_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255005": {
+       "content": "<|BAD_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255006": {
+       "content": "<|USER_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255007": {
+       "content": "<|CHATBOT_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255008": {
+       "content": "<|SYSTEM_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255009": {
+       "content": "<|USER_0_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255010": {
+       "content": "<|USER_1_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255011": {
+       "content": "<|USER_2_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255012": {
+       "content": "<|USER_3_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255013": {
+       "content": "<|USER_4_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255014": {
+       "content": "<|USER_5_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255015": {
+       "content": "<|USER_6_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255016": {
+       "content": "<|USER_7_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255017": {
+       "content": "<|USER_8_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255018": {
+       "content": "<|USER_9_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255019": {
+       "content": "<|EXTRA_0_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255020": {
+       "content": "<|EXTRA_1_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255021": {
+       "content": "<|EXTRA_2_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255022": {
+       "content": "<|EXTRA_3_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255023": {
+       "content": "<|EXTRA_4_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255024": {
+       "content": "<|EXTRA_5_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255025": {
+       "content": "<|EXTRA_6_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255026": {
+       "content": "<|EXTRA_7_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255027": {
+       "content": "<|EXTRA_8_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     },
+     "255028": {
+       "content": "<|EXTRA_9_TOKEN|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": false
+     }
+   },
+   "bos_token": "<BOS_TOKEN>",
+   "chat_template": [
+     {
+       "name": "default",
+       "template": "{{ bos_token }}{% if messages[0]['role'] == 'system' %}{% set loop_messages = messages[1:] %}{% set system_message = messages[0]['content'] %}{% elif false == true %}{% set loop_messages = messages %}{% set system_message = 'You are Command-R, a brilliant, sophisticated, AI-assistant trained to assist human users by providing thorough responses. You are trained by Cohere.' %}{% else %}{% set loop_messages = messages %}{% set system_message = false %}{% endif %}{% if system_message != false %}{{ '<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>' + system_message + '<|END_OF_TURN_TOKEN|>' }}{% endif %}{% for message in loop_messages %}{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}{{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}{% endif %}{% set content = message['content'] %}{% if message['role'] == 'user' %}{{ '<|START_OF_TURN_TOKEN|><|USER_TOKEN|>' + content.strip() + '<|END_OF_TURN_TOKEN|>' }}{% elif message['role'] == 'assistant' %}{{ '<|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>' + content.strip() + '<|END_OF_TURN_TOKEN|>' }}{% endif %}{% endfor %}{% if add_generation_prompt %}{{ '<|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>' }}{% endif %}"
+     },
+     {
+       "name": "tool_use",
+       "template": "{{ bos_token }}{% if messages[0]['role'] == 'system' %}{% set loop_messages = messages[1:] %}{% set system_message = messages[0]['content'] %}{% else %}{% set loop_messages = messages %}{% set system_message = '## Task and Context\\nYou help people answer their questions and other requests interactively. You will be asked a very wide array of requests on all kinds of topics. You will be equipped with a wide range of search engines or similar tools to help you, which you use to research your answer. You should focus on serving the user\\'s needs as best you can, which will be wide-ranging.\\n\\n## Style Guide\\nUnless the user asks for a different style of answer, you should answer in full sentences, using proper grammar and spelling.' %}{% endif %}{{ '<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>' }}{{ '# Safety Preamble' }}{{ '\nThe instructions in this section override those in the task description and style guide sections. Don\\'t answer questions that are harmful or immoral.' }}{{ '\n\n# System Preamble' }}{{ '\n## Basic Rules' }}{{ '\nYou are a powerful conversational AI trained by Cohere to help people. You are augmented by a number of tools, and your job is to use and consume the output of these tools to best help the user. You will see a conversation history between yourself and a user, ending with an utterance from the user. You will then see a specific instruction instructing you what kind of response to generate. When you answer the user\\'s requests, you cite your sources in your answers, according to those instructions.' }}{{ '\n\n# User Preamble' }}{{ '\n' + system_message }}{{'\n\n## Available Tools\nHere is a list of tools that you have available to you:\n\n'}}{% for tool in tools %}{% if loop.index0 != 0 %}{{ '\n\n'}}{% endif %}{{'```python\ndef ' + tool.name + '('}}{% for param_name, param_fields in tool.parameter_definitions.items() %}{% if loop.index0 != 0 %}{{ ', '}}{% endif %}{{param_name}}: {% if not param_fields.required %}{{'Optional[' + param_fields.type + '] = None'}}{% else %}{{ param_fields.type }}{% endif %}{% endfor %}{{ ') -> List[Dict]:\n    \"\"\"'}}{{ tool.description }}{% if tool.parameter_definitions|length != 0 %}{{ '\n\n    Args:\n        '}}{% for param_name, param_fields in tool.parameter_definitions.items() %}{% if loop.index0 != 0 %}{{ '\n        ' }}{% endif %}{{ param_name + ' ('}}{% if not param_fields.required %}{{'Optional[' + param_fields.type + ']'}}{% else %}{{ param_fields.type }}{% endif %}{{ '): ' + param_fields.description }}{% endfor %}{% endif %}{{ '\n    \"\"\"\n    pass\n```' }}{% endfor %}{{ '<|END_OF_TURN_TOKEN|>'}}{% for message in loop_messages %}{% set content = message['content'] %}{% if message['role'] == 'user' %}{{ '<|START_OF_TURN_TOKEN|><|USER_TOKEN|>' + content.strip() + '<|END_OF_TURN_TOKEN|>' }}{% elif message['role'] == 'system' %}{{ '<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>' + content.strip() + '<|END_OF_TURN_TOKEN|>' }}{% elif message['role'] == 'assistant' %}{{ '<|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>' + content.strip() + '<|END_OF_TURN_TOKEN|>' }}{% endif %}{% endfor %}{{'<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>Write \\'Action:\\' followed by a json-formatted list of actions that you want to perform in order to produce a good response to the user\\'s last input. You can use any of the supplied tools any number of times, but you should aim to execute the minimum number of necessary actions for the input. You should use the `directly-answer` tool if calling the other tools is unnecessary. The list of actions you want to call should be formatted as a list of json objects, for example:\n```json\n[\n    {\n        \"tool_name\": title of the tool in the specification,\n        \"parameters\": a dict of parameters to input into the tool as they are defined in the specs, or {} if it takes no parameters\n    }\n]```<|END_OF_TURN_TOKEN|>'}}{% if add_generation_prompt %}{{ '<|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>' }}{% endif %}"
+     },
+     {
+       "name": "rag",
+       "template": "{{ bos_token }}{% if messages[0]['role'] == 'system' %}{% set loop_messages = messages[1:] %}{% set system_message = messages[0]['content'] %}{% else %}{% set loop_messages = messages %}{% set system_message = '## Task and Context\\nYou help people answer their questions and other requests interactively. You will be asked a very wide array of requests on all kinds of topics. You will be equipped with a wide range of search engines or similar tools to help you, which you use to research your answer. You should focus on serving the user\\'s needs as best you can, which will be wide-ranging.\\n\\n## Style Guide\\nUnless the user asks for a different style of answer, you should answer in full sentences, using proper grammar and spelling.' %}{% endif %}{{ '<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>' }}{{ '# Safety Preamble' }}{{ '\nThe instructions in this section override those in the task description and style guide sections. Don\\'t answer questions that are harmful or immoral.' }}{{ '\n\n# System Preamble' }}{{ '\n## Basic Rules' }}{{ '\nYou are a powerful conversational AI trained by Cohere to help people. You are augmented by a number of tools, and your job is to use and consume the output of these tools to best help the user. You will see a conversation history between yourself and a user, ending with an utterance from the user. You will then see a specific instruction instructing you what kind of response to generate. When you answer the user\\'s requests, you cite your sources in your answers, according to those instructions.' }}{{ '\n\n# User Preamble' }}{{ '\n' + system_message }}{{ '<|END_OF_TURN_TOKEN|>'}}{% for message in loop_messages %}{% set content = message['content'] %}{% if message['role'] == 'user' %}{{ '<|START_OF_TURN_TOKEN|><|USER_TOKEN|>' + content.strip() + '<|END_OF_TURN_TOKEN|>' }}{% elif message['role'] == 'system' %}{{ '<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>' + content.strip() + '<|END_OF_TURN_TOKEN|>' }}{% elif message['role'] == 'assistant' %}{{ '<|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>' + content.strip() + '<|END_OF_TURN_TOKEN|>' }}{% endif %}{% endfor %}{{ '<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>'}}{{ '<results>' }}{% for document in documents %}{{ '\nDocument: ' }}{{ loop.index0 }}\n{% for key, value in document.items() %}{{ key }}: {{value}}\n{% endfor %}{% endfor %}{{ '</results>'}}{{ '<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>' }}{{ 'Carefully perform the following instructions, in order, starting each with a new line.\n' }}{{ 'Firstly, Decide which of the retrieved documents are relevant to the user\\'s last input by writing \\'Relevant Documents:\\' followed by comma-separated list of document numbers. If none are relevant, you should instead write \\'None\\'.\n' }}{{ 'Secondly, Decide which of the retrieved documents contain facts that should be cited in a good answer to the user\\'s last input by writing \\'Cited Documents:\\' followed a comma-separated list of document numbers. If you dont want to cite any of them, you should instead write \\'None\\'.\n' }}{% if citation_mode=='accurate' %}{{ 'Thirdly, Write \\'Answer:\\' followed by a response to the user\\'s last input in high quality natural english. Use the retrieved documents to help you. Do not insert any citations or grounding markup.\n' }}{% endif %}{{ 'Finally, Write \\'Grounded answer:\\' followed by a response to the user\\'s last input in high quality natural english. Use the symbols <co: doc> and </co: doc> to indicate when a fact comes from a document in the search result, e.g <co: 0>my fact</co: 0> for a fact from document 0.' }}{{ '<|END_OF_TURN_TOKEN|>' }}{% if add_generation_prompt %}{{ '<|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>' }}{% endif %}"
+     }
+   ],
+   "clean_up_tokenization_spaces": false,
+   "eos_token": "<|END_OF_TURN_TOKEN|>",
+   "legacy": true,
+   "merges_file": null,
+   "model_max_length": 1000000000000000019884624838656,
+   "pad_token": "<PAD>",
+   "sp_model_kwargs": {},
+   "spaces_between_special_tokens": false,
+   "tokenizer_class": "CohereTokenizer",
+   "unk_token": null,
+   "use_default_system_prompt": false,
+   "vocab_file": null
+ }
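
The three `chat_template` entries above (`default`, `tool_use`, `rag`) are what `tokenizer.apply_chat_template` renders. A minimal sketch of the default template, which produces the same turn-token framing used throughout this config (model id assumed as above):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("nitky/Megac4ai-command-r-plus")
messages = [{"role": "user", "content": "Hello, how are you?"}]
prompt = tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
# <BOS_TOKEN><|START_OF_TURN_TOKEN|><|USER_TOKEN|>Hello, how are you?<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>
```
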