bhenrym14 committed on
Commit ffab578
1 Parent(s): 6420d86

Upload 23 files

LICENSE ADDED
@@ -0,0 +1,323 @@
+ Yi Series Models License Agreement
+ Version: 2.0
+ Date of Release: November 4, 2023
+
+ 1. Definition
+
+ “Agreement” refers to the terms and conditions defined in this Yi Series Models
+ License Agreement for the use, reproduction and distribution of Yi Series
+ Models.
+
+ “Model” refers to associated components (including checkpoints) developed based
+ on machine learning, including learned weights and parameters (including the
+ status of optimizer).
+
+ “Yi Series Models” refers to opensource models with different specifications and
+ capabilities named “Yi” provided by the Licensor, including Yi-6B, Yi-34B etc.
+
+ “Derivatives” refers to all modifications to Yi Series Models, work based on Yi
+ Series Models, or any other models created or initialized by transferring the
+ weights, parameters, activations, or output patterns of Yi Series Models to
+ other models to achieve similar performance, including but not limited to
+ methods that require using intermediate data representations or generating
+ synthetic data based on Yi Series Models to train other models.
+
+ “Licensor” refers to Beijing Lingyiwanwu Information Technology Co., Ltd.
+
+ “you” refers to an individual or legal entity that exercises the license granted
+ by this Agreement and/or uses the Yi Series Models for any purpose and in any
+ field of use.
+
+ “Third Party” refers to any individuals, legal entities or non-legal
+ organizations other than you.
+
+ “Distribute” refers to transmitting, copying, publishing, or otherwise sharing
+ the Yi Series Models with third parties, including providing the Yi Series
+ Models through electronic or other remote means (such as any SaaS software or
+ PaaS software accessed via API or web access).
+
+ “Commercial Purposes” refers to the use of the Yi Series Models, directly or
+ indirectly, for the operation, promotion, revenue generation, or any other
+ profit-making purposes for entities or individuals.
+
+ “Laws and Regulations” refers to the laws and administrative regulations of the
+ mainland of the People's Republic of China (for the purposes of this Agreement
+ only, excluding Hong Kong, Macau, and Taiwan).
+
+ “Personal Information” refers to various information related to identified or
+ identifiable natural persons recorded electronically or by other means,
+ excluding information that has been anonymized.
+
+ “Logo” refers to any trademark, service mark, trade name, domain name, website
+ name, or other distinctive branding marks.
+
+
+ 2. License and License Restrictions
+
+ The Licensor hereby grants you a non-exclusive, global, non-transferable,
+ non-sub-licensable, revocable, and royalty-free copyright license. You must
+ adhere to the following license restrictions:
+
+ 1) Your use of the Yi Series Models must comply with the Laws and Regulations as
+ well as applicable legal requirements of other countries/regions, and respect
+ social ethics and moral standards, including but not limited to, not using the
+ Yi Series Models for purposes prohibited by Laws and Regulations as well as
+ applicable legal requirements of other countries/regions, such as harming
+ national security, promoting terrorism, extremism, inciting ethnic or racial
+ hatred, discrimination, violence, or pornography, and spreading false harmful
+ information.
+
+ 2) You shall not, for military or unlawful purposes or in ways not allowed by
+ Laws and Regulations as well as applicable legal requirements of other
+ countries/regions, a) use, copy or Distribute the Yi Series Models, or b) create
+ complete or partial Derivatives of the Yi Series Models.
+
+ 3) Your use of the Yi Series Models (including using the output of the Yi Series
+ Models) and the creation of Derivatives must not infringe upon the legitimate
+ rights of any Third Party, including but not limited to the rights of personal
+ rights such as the right to likeness, reputation, and privacy, as well as
+ intellectual property rights such as copyrights, patents, trade secrets, and
+ other property rights.
+
+ 4) You must clearly attribute the source of the Yi Series Models to the Licensor
+ and provide a copy of this Agreement to any Third-Party users of the Yi Series
+ Models and Derivatives.
+
+ 5) If you modify the Yi Series Models to create Derivatives, you must clearly
+ indicate the substantial modifications made, and these modifications shall not
+ violate the license restrictions of this Agreement. You shall not enable,
+ assist, or in any way facilitate Third Parties to violate the license
+ restrictions of this Agreement.
+
+ If you plan to use the Yi Series Models and Derivatives for Commercial Purposes,
+ you should contact the Licensor in advance as specified in Section 7 of this
+ Agreement named "Updates to the Agreement and Contact Information" and obtain
+ written authorization from the Licensor. When you obtain authorization from the
+ Licensor to use the Yi Series Models and Derivatives for Commercial Purposes,
+ you must comply with the afore-mentioned license restrictions.
+
+
+ 3. Intellectual Property
+
+ The ownership of the Yi Series Models and their related intellectual property
+ rights is solely held by the Licensor.
+
+ In any circumstance, without the prior written consent of the Licensor, you are
+ not allowed to use any Logo associated with the Licensor. If your use of
+ Licensor's Logo in violation of this Agreement causes any losses to the Licensor
+ or others, you will bear full legal responsibility.
+
+
+ 4. Disclaimer and Limitation of Liability
+
+ The Yi Series Models are provided "AS IS." The Licensor does not provide any
+ express or implied warranties for the Yi Series Models, including but not
+ limited to stability, ownership, merchantability, non-infringement, or fitness
+ for a specific purpose of the Yi Series Models and their output results. You
+ assume all responsibilities for the risks and consequences arising from the use,
+ reproduction, distribution of the Yi Series Models, and the creation of
+ Derivatives.
+
+ The Licensor complies with Laws and Regulations at all stages of model training,
+ maintaining the legality, authenticity, accuracy, objectivity, and diversity of
+ data and algorithms. The Licensor is not liable for any direct, indirect,
+ incidental consequences, and other losses or damages related to your use,
+ reproduction, and distribution of the Yi Series Models, and the creation of
+ Derivatives under this Agreement. This includes but is not limited to:
+
+ 1) The Licensor is not responsible for data security risks resulting from your
+ use of the Yi Series Models.
+
+ 2) The Yi Series Models may contain Personal Information. When you use Yi Series
+ Models, you acknowledge that you are the data processing entity as defined under
+ the Laws and Regulations responsible for determining the processing methods and
+ purposes of Personal Information. You must comply with legal requirements for
+ processing any Personal Information that may be contained in the Yi Series
+ Models and assume the associated legal responsibilities, as well as the risks
+ and consequences of processing Personal Information.
+
+ 3) The Licensor is not liable for reputation risks arising from your use of the
+ Yi Series Models or the output results of the Yi Series Models.
+
+ 4) The Licensor is not liable for intellectual property risks associated with
+ your use of the Yi Series Models’ output results.
+
+ If your use, reproduction, distribution of the Yi Series Models, or the creation
+ of Derivatives result in losses to the Licensor, the Licensor has the right to
+ seek compensation from you. For any claims made by Third Parties against the
+ Licensor related to your use, reproduction, and distribution of the Yi Series
+ Models, or the creation of Derivatives, the Licensor has the right to demand
+ that you defend, compensate, and indemnify the Licensor and protect the Licensor
+ from harm.
+
+
+ 5. Dispute Resolution
+
+ The stipulation, effectiveness, interpretation, performance, modification, and
+ termination of the Agreement, the use, copy and Distribute of the Yi Series
+ Models, and dispute resolution associated with your use, copy and distribution
+ shall be governed by the laws of the mainland of the People's Republic of China
+ (for the purposes of this agreement only, excluding Hong Kong, Macau, and
+ Taiwan), and the application of conflict of laws is excluded.
+
+ Any disputes arising from the use, copy or distribution of the Yi Series Models
+ should first be resolved through amicable negotiations. If negotiations fail,
+ legal proceedings should be initiated in the People's Court at the location of
+ the Licensor.
+
+
+ 6. Effectiveness and Termination of the Agreement
+
+ Your use of the Yi Series Models signifies that you have read and agreed to be
+ bound by the terms of the Agreement. The Agreement becomes effective from the
+ date of your use of the Yi Series Models and will terminate from the date you
+ cease using the Yi Series Models. If you violate any terms or restrictions in
+ the Agreement, the Licensor reserves the right to terminate the Agreement.
+
+ Upon termination of the Agreement, you must immediately cease using the Yi
+ Series Models. Section 4, "Disclaimer and Limitation of Liability," and Section
+ 5, "Dispute Resolution," of this Agreement remain in effect after the
+ termination of this Agreement.
+
+
+ 7. Updates to the Agreement and Contact Information
+
+ The Licensor reserves the right to update the Agreement from time to time. The
+ latest version of the Agreement will be posted by the Licensor through
+ https://01.ai.
+
+ For any questions related to licensing and copyright, please contact the
+ Licensor at yi@01.ai.
+
+
+ Yi Series Models License Agreement (Chinese version)
+ Version: 2.0
+ Date of Release: November 4, 2023
+
+ 1. Definition
+
+ "Agreement" refers to the terms and conditions defined in this Agreement for
+ the use, reproduction, and distribution of the Yi Series Models.
+
+ "Model" refers to any associated components based on machine learning
+ (including checkpoints), including learned weights and parameters (including
+ optimizer states).
+
+ "Yi Series Models" refers to the open-source models of different
+ specifications and capabilities named "Yi" provided by the Licensor, including
+ Yi-6B, Yi-34B, etc.
+
+ "Derivatives" refers to all modifications to the Yi Series Models, work based
+ on the Yi Series Models, or any other models created or initialized by
+ transferring the weights, parameters, activations, or output patterns of the
+ Yi Series Models to other models so that those models perform similarly to the
+ Yi Series Models, including but not limited to extraction methods that require
+ intermediate data representations, or methods that generate synthetic data
+ from the Yi Series Models to train other models.
+
+ "Licensor" refers to Beijing Lingyiwanwu Information Technology Co., Ltd.
+
+ "You" refers to an individual or legal entity that exercises the license
+ granted by this Agreement and/or uses the Yi Series Models for any purpose and
+ in any field of use.
+
+ "Third Party" refers to any individual, legal entity, or unincorporated
+ organization other than you.
+
+ "Distribute" refers to transmitting, copying, publishing, or otherwise sharing
+ the Yi Series Models with third parties, including providing the Yi Series
+ Models through electronic or other remote means (such as any SaaS or PaaS
+ software accessed via API or web access).
+
+ "Commercial Purposes" refers to the use of the Yi Series Models, directly or
+ indirectly, to operate, promote, or generate revenue for entities or
+ individuals, or for any other profit-making purposes.
+
+ "Laws and Regulations" refers to the laws and administrative regulations of
+ the mainland of the People's Republic of China (for the purposes of this
+ Agreement only, excluding Hong Kong, Macau, and Taiwan).
+
+ "Personal Information" refers to all kinds of information, recorded
+ electronically or by other means, relating to identified or identifiable
+ natural persons, excluding anonymized information.
+
+ "Logo" refers to any trademark, service mark, trade name, domain name, website
+ name, or other distinctive branding mark.
+
+
+ 2. License and License Restrictions
+
+ The Licensor hereby grants you a non-exclusive, global, non-transferable,
+ non-sub-licensable, revocable, royalty-free copyright license. You must
+ satisfy the following license restrictions:
+
+ 1) Your use of the Yi Series Models must comply with the Laws and Regulations
+ as well as applicable legal requirements of other countries/regions, and
+ respect social morality and ethics. This includes but is not limited to not
+ using the Yi Series Models for purposes prohibited by the Laws and Regulations
+ or by applicable legal requirements of other countries/regions, such as
+ harming national security, promoting terrorism or extremism, inciting ethnic
+ or racial hatred or discrimination, violence, or pornography, or spreading
+ false and harmful information.
+
+ 2) You shall not, for military or unlawful purposes, or in ways not permitted
+ by the Laws and Regulations or applicable legal requirements of other
+ countries/regions, a) use, copy, or Distribute the Yi Series Models, or
+ b) create complete or partial Derivatives of the Yi Series Models.
+
+ 3) Your use of the Yi Series Models (including use of their output) and your
+ creation of Derivatives must not infringe the legitimate rights of any Third
+ Party, including but not limited to personal rights such as the rights to
+ likeness, reputation, and privacy, intellectual property rights such as
+ copyrights, patents, and trade secrets, and other property rights.
+
+ 4) You must make clear to any Third-Party users of the Yi Series Models and
+ their Derivatives that the source of the Yi Series Models is the Licensor, and
+ you must provide them with a copy of this Agreement.
+
+ 5) If you modify the Yi Series Models to create Derivatives, you must state
+ the modifications conspicuously; the modifications must not violate the
+ license restrictions of this Agreement, and you must not permit, assist, or
+ otherwise enable any Third Party to violate them.
+
+ If you plan to use the Yi Series Models and Derivatives for Commercial
+ Purposes, you should first contact the Licensor to register through the means
+ given in Section 7, "Updates to the Agreement and Contact Information," and
+ obtain the Licensor's written authorization. Once authorized by the Licensor
+ to use the Yi Series Models and Derivatives for Commercial Purposes, you must
+ satisfy the Licensor's license restrictions above.
+
+
+ 3. Intellectual Property
+
+ The ownership of the Yi Series Models and their related intellectual property
+ rights is held solely by the Licensor.
+
+ In no circumstance may you use any Logo of the Licensor without the Licensor's
+ prior written consent. If your use of the Licensor's Logo in violation of this
+ Agreement causes losses to the Licensor or others, you bear full legal
+ responsibility.
+
+
+ 4. Disclaimer and Limitation of Liability
+
+ The Yi Series Models are provided "as is." The Licensor provides no express or
+ implied warranties for the Yi Series Models, including but not limited to the
+ stability, ownership, merchantability, non-infringement, or fitness for a
+ particular purpose of the models and their output. You bear all responsibility
+ for the risks and consequences arising from the use, reproduction, and
+ distribution of the Yi Series Models and the creation of Derivatives.
+
+ The Licensor complies with the Laws and Regulations at all stages of model
+ training and upholds the legality, authenticity, accuracy, objectivity, and
+ diversity of data and algorithms. The Licensor is not liable for any direct,
+ indirect, or incidental consequences, or any other losses or damages, arising
+ from or related to your use, reproduction, and distribution of the Yi Series
+ Models and your creation of Derivatives under this Agreement, including but
+ not limited to:
+
+ 1) The Licensor is not responsible for data security risks resulting from
+ your use of the Yi Series Models.
+
+ 2) The Yi Series Models may contain Personal Information. When you use the Yi
+ Series Models, you acknowledge that you are the personal information processor
+ as defined under the Laws and Regulations, responsible for determining the
+ methods and purposes of processing Personal Information. You must process any
+ Personal Information that may be contained in the Yi Series Models in
+ accordance with legal requirements and assume the corresponding legal
+ responsibilities, as well as the risks and consequences of processing Personal
+ Information.
+
+ 3) The Licensor is not liable for reputation risks arising from your use of
+ the Yi Series Models or their output.
+
+ 4) The Licensor is not liable for intellectual property risks involved in
+ your use of the output of the Yi Series Models.
+
+ If your use, reproduction, or distribution of the Yi Series Models, or your
+ creation of Derivatives, causes losses to the Licensor, the Licensor has the
+ right to demand compensation from you. For any claims brought against the
+ Licensor by Third Parties in connection with your use, reproduction, or
+ distribution of the Yi Series Models or your creation of Derivatives, the
+ Licensor has the right to demand that you defend, compensate, and hold the
+ Licensor harmless.
+
+
+ 5. Dispute Resolution
+
+ The formation, validity, interpretation, performance, modification, and
+ termination of the Agreement, the use, copying, and distribution of the Yi
+ Series Models, and the resolution of related disputes are governed by the laws
+ of the mainland of the People's Republic of China (for the purposes of this
+ Agreement only, excluding Hong Kong, Macau, and Taiwan), with conflict-of-law
+ rules excluded.
+
+ Any dispute arising from the use, copying, or distribution of the Yi Series
+ Models shall first be resolved through amicable negotiation. If negotiation
+ fails, proceedings shall be brought before the People's Court at the
+ Licensor's place of domicile.
+
+
+ 6. Effectiveness and Termination of the Agreement
+
+ Your use of the Yi Series Models signifies that you have read and agreed to be
+ bound by the Agreement. The Agreement takes effect on the date you begin using
+ the Yi Series Models and terminates on the date you stop using them. If you
+ violate any term or restriction of the Agreement, the Licensor has the right
+ to terminate the Agreement.
+
+ Upon termination of the Agreement, you must immediately stop using the Yi
+ Series Models. Section 4, "Disclaimer and Limitation of Liability," and
+ Section 5, "Dispute Resolution," survive termination of this Agreement.
+
+
+ 7. Updates to the Agreement and Contact Information
+
+ The Licensor has the right to update the Agreement from time to time. The
+ Licensor will publish the latest version of the Agreement via https://01.ai.
+ For any questions related to licensing and copyright, please contact the
+ Licensor at yi@01.ai.
config.json ADDED
@@ -0,0 +1,28 @@
+ {
+   "_name_or_path": "../../../basemodels/Yi-34B-200K-Llamafied",
+   "architectures": [
+     "LlamaForCausalLM"
+   ],
+   "attention_bias": false,
+   "bos_token_id": 1,
+   "eos_token_id": 2,
+   "hidden_act": "silu",
+   "hidden_size": 7168,
+   "initializer_range": 0.02,
+   "intermediate_size": 20480,
+   "max_position_embeddings": 200000,
+   "model_type": "llama",
+   "num_attention_heads": 56,
+   "num_hidden_layers": 60,
+   "num_key_value_heads": 8,
+   "pad_token_id": 0,
+   "pretraining_tp": 1,
+   "rms_norm_eps": 1e-05,
+   "rope_scaling": null,
+   "rope_theta": 5000000.0,
+   "tie_word_embeddings": false,
+   "torch_dtype": "float16",
+   "transformers_version": "4.35.2",
+   "use_cache": true,
+   "vocab_size": 64000
+ }
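
The config above describes the Llama-architecture ("llamafied") layout of Yi-34B-200K: 60 hidden layers, grouped-query attention with 56 query heads sharing 8 key/value heads, and a RoPE base (rope_theta) of 5e6 to reach the 200,000-token context window. As a sanity check, the attention projection shapes follow directly from these values; the sketch below derives them in plain Python (the variable names simply mirror the config keys and are illustrative only).

    # Derive the attention projection shapes implied by config.json.
    hidden_size = 7168
    num_attention_heads = 56
    num_key_value_heads = 8   # grouped-query attention

    head_dim = hidden_size // num_attention_heads   # 7168 // 56 = 128
    q_out = num_attention_heads * head_dim          # 56 * 128 = 7168
    kv_out = num_key_value_heads * head_dim         # 8 * 128 = 1024

    # q_proj and o_proj are (7168, 7168); k_proj and v_proj are (1024, 7168),
    # i.e. each group of 7 query heads shares one key/value head.
    print(head_dim, q_out, kv_out, num_attention_heads // num_key_value_heads)
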
generation_config.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "_from_model_config": true,
+   "bos_token_id": 1,
+   "eos_token_id": 2,
+   "pad_token_id": 0,
+   "transformers_version": "4.35.2"
+ }
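
generation_config.json only pins the special-token ids (BOS 1, EOS 2, PAD 0) and leaves decoding parameters to the caller. Below is a minimal loading-and-generation sketch using the transformers library; it assumes transformers >= 4.35 and accelerate are installed, that the files from this commit sit in a local directory (the path shown is hypothetical), and that enough GPU memory is available for the float16 weights (roughly 69 GB across the 15 shards).

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    path = "./Yi-34B-200K-Llamafied"  # hypothetical local checkout of this repo
    tokenizer = AutoTokenizer.from_pretrained(path)
    model = AutoModelForCausalLM.from_pretrained(
        path,
        torch_dtype=torch.float16,  # matches "torch_dtype" in config.json
        device_map="auto",          # spread the 15 shards across available GPUs
    )

    inputs = tokenizer("The Yi series models are", return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=64, eos_token_id=2, pad_token_id=0)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
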
model-00001-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:de57598d8829b0f9dcf4687968a924fd80ab181e643e531b180ab59e330735c3
+ size 4793130720
model-00002-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:447cda7f8d82778efce0656f1643ced90029298819e47afe779b132fd633fe15
+ size 4756459680
model-00003-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e3b75013dccef1b1766b1fb502df0a1abab64f6be7fffb634aa32564aa139b85
+ size 4991370096
model-00004-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c68eac88f5780aabf73ad2f729d885029c84bf233d0cf04e1884f5a60a708a6f
+ size 4756459720
model-00005-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:88575757946414dd532f8ad0105676a38eada9c7e71c09bcd6c36720b2d716da
+ size 4756459720
model-00006-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f0298655fb528b64237cadb233c821f7ed42af6f14e24507eec1115df2747a0a
+ size 4991370120
model-00007-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:363290fb7427f34719db1c256f9a42dcc443e58b11d2ab186b59d710f8469bf6
+ size 4756459720
model-00008-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:92cfb1d53ba270835ebd6da614b8fcb37dcb22da03c7346fcff369a3c7252b2c
+ size 4756459720
model-00009-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:00e6c9b8d0ea1af3a0e22cd6afbebe6df3bae09b8b8599edac808c7ed9774838
+ size 4991370120
model-00010-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:958034fb2cdb9494521d73e075b3f89eb5a6bb24d270d6dad00779b5929a055d
+ size 4756459720
model-00011-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b1b86a03a6753a7b02017356001d52c4b19ce75fdcdac87cd41c8c8ddf27b4fc
+ size 4756459720
model-00012-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:baa88c18bf33cf768da028e3fb21bcaecdb37bcc9afff55702a6af3ec3b72dcc
+ size 4991370120
model-00013-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e07f816de481072f75cadf0cc3a2808ee82eef49ef5cd696718db8a23cead64e
+ size 4756459720
model-00014-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4d9e1f56f3ae3f8c5a515b463bee98020cafd87b19007192391cbb80ce73efe1
+ size 4756459720
model-00015-of-00015.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b71f68e999acdc1ca5622f30fc91bd094c5a82b23644af5241c5714a9d27e6c6
+ size 1211148848
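
Each shard above is stored as a Git LFS pointer: three lines giving the spec version, the SHA-256 of the actual file, and its size in bytes. After downloading, a shard can be checked against its pointer with nothing but the Python standard library; the helper below is an illustrative sketch, not part of this repo.

    import hashlib

    def parse_lfs_pointer(pointer_path):
        # Pointer format: "version <url>", "oid sha256:<hex>", "size <bytes>".
        fields = {}
        with open(pointer_path) as f:
            for line in f:
                key, _, value = line.strip().partition(" ")
                fields[key] = value
        return fields["oid"].split(":", 1)[1], int(fields["size"])

    def verify_shard(shard_path, pointer_path, chunk_size=1 << 20):
        expected_oid, expected_size = parse_lfs_pointer(pointer_path)
        digest, size = hashlib.sha256(), 0
        with open(shard_path, "rb") as f:
            while block := f.read(chunk_size):
                digest.update(block)
                size += len(block)
        return digest.hexdigest() == expected_oid and size == expected_size

    # e.g. verify_shard("model-00001-of-00015.safetensors", "model-00001.pointer")
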
model.safetensors.index.json ADDED
@@ -0,0 +1,550 @@
+ {
+   "metadata": {
+     "total_size": 68777834496
+   },
+   "weight_map": {
+     "lm_head.weight": "model-00015-of-00015.safetensors",
+     "model.embed_tokens.weight": "model-00001-of-00015.safetensors",
+     "model.layers.0.input_layernorm.weight": "model-00001-of-00015.safetensors",
+     "model.layers.0.mlp.down_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.0.mlp.up_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00015.safetensors",
+     "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.1.input_layernorm.weight": "model-00001-of-00015.safetensors",
+     "model.layers.1.mlp.down_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.1.mlp.up_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00015.safetensors",
+     "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.10.input_layernorm.weight": "model-00003-of-00015.safetensors",
+     "model.layers.10.mlp.down_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.10.mlp.gate_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.10.mlp.up_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.10.post_attention_layernorm.weight": "model-00003-of-00015.safetensors",
+     "model.layers.10.self_attn.k_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.10.self_attn.o_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.10.self_attn.q_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.10.self_attn.v_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.11.input_layernorm.weight": "model-00003-of-00015.safetensors",
+     "model.layers.11.mlp.down_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.11.mlp.gate_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.11.mlp.up_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.11.post_attention_layernorm.weight": "model-00003-of-00015.safetensors",
+     "model.layers.11.self_attn.k_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.11.self_attn.o_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.11.self_attn.q_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.11.self_attn.v_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.12.input_layernorm.weight": "model-00004-of-00015.safetensors",
+     "model.layers.12.mlp.down_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.12.mlp.gate_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.12.mlp.up_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.12.post_attention_layernorm.weight": "model-00004-of-00015.safetensors",
+     "model.layers.12.self_attn.k_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.12.self_attn.o_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.12.self_attn.q_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.12.self_attn.v_proj.weight": "model-00003-of-00015.safetensors",
+     "model.layers.13.input_layernorm.weight": "model-00004-of-00015.safetensors",
+     "model.layers.13.mlp.down_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.13.mlp.gate_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.13.mlp.up_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.13.post_attention_layernorm.weight": "model-00004-of-00015.safetensors",
+     "model.layers.13.self_attn.k_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.13.self_attn.o_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.13.self_attn.q_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.13.self_attn.v_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.14.input_layernorm.weight": "model-00004-of-00015.safetensors",
+     "model.layers.14.mlp.down_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.14.mlp.gate_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.14.mlp.up_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.14.post_attention_layernorm.weight": "model-00004-of-00015.safetensors",
+     "model.layers.14.self_attn.k_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.14.self_attn.o_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.14.self_attn.q_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.14.self_attn.v_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.15.input_layernorm.weight": "model-00004-of-00015.safetensors",
+     "model.layers.15.mlp.down_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.15.mlp.gate_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.15.mlp.up_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.15.post_attention_layernorm.weight": "model-00004-of-00015.safetensors",
+     "model.layers.15.self_attn.k_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.15.self_attn.o_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.15.self_attn.q_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.15.self_attn.v_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.16.input_layernorm.weight": "model-00005-of-00015.safetensors",
+     "model.layers.16.mlp.down_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.16.mlp.gate_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.16.mlp.up_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.16.post_attention_layernorm.weight": "model-00005-of-00015.safetensors",
+     "model.layers.16.self_attn.k_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.16.self_attn.o_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.16.self_attn.q_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.16.self_attn.v_proj.weight": "model-00004-of-00015.safetensors",
+     "model.layers.17.input_layernorm.weight": "model-00005-of-00015.safetensors",
+     "model.layers.17.mlp.down_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.17.mlp.gate_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.17.mlp.up_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.17.post_attention_layernorm.weight": "model-00005-of-00015.safetensors",
+     "model.layers.17.self_attn.k_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.17.self_attn.o_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.17.self_attn.q_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.17.self_attn.v_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.18.input_layernorm.weight": "model-00005-of-00015.safetensors",
+     "model.layers.18.mlp.down_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.18.mlp.gate_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.18.mlp.up_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.18.post_attention_layernorm.weight": "model-00005-of-00015.safetensors",
+     "model.layers.18.self_attn.k_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.18.self_attn.o_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.18.self_attn.q_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.18.self_attn.v_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.19.input_layernorm.weight": "model-00005-of-00015.safetensors",
+     "model.layers.19.mlp.down_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.19.mlp.gate_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.19.mlp.up_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.19.post_attention_layernorm.weight": "model-00005-of-00015.safetensors",
+     "model.layers.19.self_attn.k_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.19.self_attn.o_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.19.self_attn.q_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.19.self_attn.v_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.2.input_layernorm.weight": "model-00001-of-00015.safetensors",
+     "model.layers.2.mlp.down_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.2.mlp.up_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00015.safetensors",
+     "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.20.input_layernorm.weight": "model-00006-of-00015.safetensors",
+     "model.layers.20.mlp.down_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.20.mlp.gate_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.20.mlp.up_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.20.post_attention_layernorm.weight": "model-00006-of-00015.safetensors",
+     "model.layers.20.self_attn.k_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.20.self_attn.o_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.20.self_attn.q_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.20.self_attn.v_proj.weight": "model-00005-of-00015.safetensors",
+     "model.layers.21.input_layernorm.weight": "model-00006-of-00015.safetensors",
+     "model.layers.21.mlp.down_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.21.mlp.gate_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.21.mlp.up_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.21.post_attention_layernorm.weight": "model-00006-of-00015.safetensors",
+     "model.layers.21.self_attn.k_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.21.self_attn.o_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.21.self_attn.q_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.21.self_attn.v_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.22.input_layernorm.weight": "model-00006-of-00015.safetensors",
+     "model.layers.22.mlp.down_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.22.mlp.gate_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.22.mlp.up_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.22.post_attention_layernorm.weight": "model-00006-of-00015.safetensors",
+     "model.layers.22.self_attn.k_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.22.self_attn.o_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.22.self_attn.q_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.22.self_attn.v_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.23.input_layernorm.weight": "model-00006-of-00015.safetensors",
+     "model.layers.23.mlp.down_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.23.mlp.gate_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.23.mlp.up_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.23.post_attention_layernorm.weight": "model-00006-of-00015.safetensors",
+     "model.layers.23.self_attn.k_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.23.self_attn.o_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.23.self_attn.q_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.23.self_attn.v_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.24.input_layernorm.weight": "model-00006-of-00015.safetensors",
+     "model.layers.24.mlp.down_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.24.mlp.gate_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.24.mlp.up_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.24.post_attention_layernorm.weight": "model-00006-of-00015.safetensors",
+     "model.layers.24.self_attn.k_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.24.self_attn.o_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.24.self_attn.q_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.24.self_attn.v_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.25.input_layernorm.weight": "model-00007-of-00015.safetensors",
+     "model.layers.25.mlp.down_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.25.mlp.gate_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.25.mlp.up_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.25.post_attention_layernorm.weight": "model-00007-of-00015.safetensors",
+     "model.layers.25.self_attn.k_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.25.self_attn.o_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.25.self_attn.q_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.25.self_attn.v_proj.weight": "model-00006-of-00015.safetensors",
+     "model.layers.26.input_layernorm.weight": "model-00007-of-00015.safetensors",
+     "model.layers.26.mlp.down_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.26.mlp.gate_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.26.mlp.up_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.26.post_attention_layernorm.weight": "model-00007-of-00015.safetensors",
+     "model.layers.26.self_attn.k_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.26.self_attn.o_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.26.self_attn.q_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.26.self_attn.v_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.27.input_layernorm.weight": "model-00007-of-00015.safetensors",
+     "model.layers.27.mlp.down_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.27.mlp.gate_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.27.mlp.up_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.27.post_attention_layernorm.weight": "model-00007-of-00015.safetensors",
+     "model.layers.27.self_attn.k_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.27.self_attn.o_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.27.self_attn.q_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.27.self_attn.v_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.28.input_layernorm.weight": "model-00007-of-00015.safetensors",
+     "model.layers.28.mlp.down_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.28.mlp.gate_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.28.mlp.up_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.28.post_attention_layernorm.weight": "model-00007-of-00015.safetensors",
+     "model.layers.28.self_attn.k_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.28.self_attn.o_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.28.self_attn.q_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.28.self_attn.v_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.29.input_layernorm.weight": "model-00008-of-00015.safetensors",
+     "model.layers.29.mlp.down_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.29.mlp.gate_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.29.mlp.up_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.29.post_attention_layernorm.weight": "model-00008-of-00015.safetensors",
+     "model.layers.29.self_attn.k_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.29.self_attn.o_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.29.self_attn.q_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.29.self_attn.v_proj.weight": "model-00007-of-00015.safetensors",
+     "model.layers.3.input_layernorm.weight": "model-00002-of-00015.safetensors",
+     "model.layers.3.mlp.down_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.3.mlp.up_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.3.post_attention_layernorm.weight": "model-00002-of-00015.safetensors",
+     "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00015.safetensors",
+     "model.layers.30.input_layernorm.weight": "model-00008-of-00015.safetensors",
+     "model.layers.30.mlp.down_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.30.mlp.gate_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.30.mlp.up_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.30.post_attention_layernorm.weight": "model-00008-of-00015.safetensors",
+     "model.layers.30.self_attn.k_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.30.self_attn.o_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.30.self_attn.q_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.30.self_attn.v_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.31.input_layernorm.weight": "model-00008-of-00015.safetensors",
+     "model.layers.31.mlp.down_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.31.mlp.gate_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.31.mlp.up_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.31.post_attention_layernorm.weight": "model-00008-of-00015.safetensors",
+     "model.layers.31.self_attn.k_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.31.self_attn.o_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.31.self_attn.q_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.31.self_attn.v_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.32.input_layernorm.weight": "model-00008-of-00015.safetensors",
+     "model.layers.32.mlp.down_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.32.mlp.gate_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.32.mlp.up_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.32.post_attention_layernorm.weight": "model-00008-of-00015.safetensors",
+     "model.layers.32.self_attn.k_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.32.self_attn.o_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.32.self_attn.q_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.32.self_attn.v_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.33.input_layernorm.weight": "model-00009-of-00015.safetensors",
+     "model.layers.33.mlp.down_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.33.mlp.gate_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.33.mlp.up_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.33.post_attention_layernorm.weight": "model-00009-of-00015.safetensors",
+     "model.layers.33.self_attn.k_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.33.self_attn.o_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.33.self_attn.q_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.33.self_attn.v_proj.weight": "model-00008-of-00015.safetensors",
+     "model.layers.34.input_layernorm.weight": "model-00009-of-00015.safetensors",
+     "model.layers.34.mlp.down_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.34.mlp.gate_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.34.mlp.up_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.34.post_attention_layernorm.weight": "model-00009-of-00015.safetensors",
+     "model.layers.34.self_attn.k_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.34.self_attn.o_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.34.self_attn.q_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.34.self_attn.v_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.35.input_layernorm.weight": "model-00009-of-00015.safetensors",
+     "model.layers.35.mlp.down_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.35.mlp.gate_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.35.mlp.up_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.35.post_attention_layernorm.weight": "model-00009-of-00015.safetensors",
+     "model.layers.35.self_attn.k_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.35.self_attn.o_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.35.self_attn.q_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.35.self_attn.v_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.36.input_layernorm.weight": "model-00009-of-00015.safetensors",
+     "model.layers.36.mlp.down_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.36.mlp.gate_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.36.mlp.up_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.36.post_attention_layernorm.weight": "model-00009-of-00015.safetensors",
+     "model.layers.36.self_attn.k_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.36.self_attn.o_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.36.self_attn.q_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.36.self_attn.v_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.37.input_layernorm.weight": "model-00009-of-00015.safetensors",
+     "model.layers.37.mlp.down_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.37.mlp.gate_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.37.mlp.up_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.37.post_attention_layernorm.weight": "model-00009-of-00015.safetensors",
+     "model.layers.37.self_attn.k_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.37.self_attn.o_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.37.self_attn.q_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.37.self_attn.v_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.38.input_layernorm.weight": "model-00010-of-00015.safetensors",
+     "model.layers.38.mlp.down_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.38.mlp.gate_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.38.mlp.up_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.38.post_attention_layernorm.weight": "model-00010-of-00015.safetensors",
+     "model.layers.38.self_attn.k_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.38.self_attn.o_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.38.self_attn.q_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.38.self_attn.v_proj.weight": "model-00009-of-00015.safetensors",
+     "model.layers.39.input_layernorm.weight": "model-00010-of-00015.safetensors",
+     "model.layers.39.mlp.down_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.39.mlp.gate_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.39.mlp.up_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.39.post_attention_layernorm.weight": "model-00010-of-00015.safetensors",
+     "model.layers.39.self_attn.k_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.39.self_attn.o_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.39.self_attn.q_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.39.self_attn.v_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.4.input_layernorm.weight": "model-00002-of-00015.safetensors",
+     "model.layers.4.mlp.down_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.4.mlp.gate_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.4.mlp.up_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.4.post_attention_layernorm.weight": "model-00002-of-00015.safetensors",
+     "model.layers.4.self_attn.k_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.4.self_attn.o_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.4.self_attn.q_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.4.self_attn.v_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.40.input_layernorm.weight": "model-00010-of-00015.safetensors",
+     "model.layers.40.mlp.down_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.40.mlp.gate_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.40.mlp.up_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.40.post_attention_layernorm.weight": "model-00010-of-00015.safetensors",
+     "model.layers.40.self_attn.k_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.40.self_attn.o_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.40.self_attn.q_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.40.self_attn.v_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.41.input_layernorm.weight": "model-00010-of-00015.safetensors",
+     "model.layers.41.mlp.down_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.41.mlp.gate_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.41.mlp.up_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.41.post_attention_layernorm.weight": "model-00010-of-00015.safetensors",
+     "model.layers.41.self_attn.k_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.41.self_attn.o_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.41.self_attn.q_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.41.self_attn.v_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.42.input_layernorm.weight": "model-00011-of-00015.safetensors",
+     "model.layers.42.mlp.down_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.42.mlp.gate_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.42.mlp.up_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.42.post_attention_layernorm.weight": "model-00011-of-00015.safetensors",
+     "model.layers.42.self_attn.k_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.42.self_attn.o_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.42.self_attn.q_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.42.self_attn.v_proj.weight": "model-00010-of-00015.safetensors",
+     "model.layers.43.input_layernorm.weight": "model-00011-of-00015.safetensors",
+     "model.layers.43.mlp.down_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.43.mlp.gate_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.43.mlp.up_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.43.post_attention_layernorm.weight": "model-00011-of-00015.safetensors",
+     "model.layers.43.self_attn.k_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.43.self_attn.o_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.43.self_attn.q_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.43.self_attn.v_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.44.input_layernorm.weight": "model-00011-of-00015.safetensors",
+     "model.layers.44.mlp.down_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.44.mlp.gate_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.44.mlp.up_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.44.post_attention_layernorm.weight": "model-00011-of-00015.safetensors",
+     "model.layers.44.self_attn.k_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.44.self_attn.o_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.44.self_attn.q_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.44.self_attn.v_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.45.input_layernorm.weight": "model-00011-of-00015.safetensors",
+     "model.layers.45.mlp.down_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.45.mlp.gate_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.45.mlp.up_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.45.post_attention_layernorm.weight": "model-00011-of-00015.safetensors",
+     "model.layers.45.self_attn.k_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.45.self_attn.o_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.45.self_attn.q_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.45.self_attn.v_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.46.input_layernorm.weight": "model-00012-of-00015.safetensors",
+     "model.layers.46.mlp.down_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.46.mlp.gate_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.46.mlp.up_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.46.post_attention_layernorm.weight": "model-00012-of-00015.safetensors",
+     "model.layers.46.self_attn.k_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.46.self_attn.o_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.46.self_attn.q_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.46.self_attn.v_proj.weight": "model-00011-of-00015.safetensors",
+     "model.layers.47.input_layernorm.weight": "model-00012-of-00015.safetensors",
+     "model.layers.47.mlp.down_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.47.mlp.gate_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.47.mlp.up_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.47.post_attention_layernorm.weight": "model-00012-of-00015.safetensors",
+     "model.layers.47.self_attn.k_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.47.self_attn.o_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.47.self_attn.q_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.47.self_attn.v_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.48.input_layernorm.weight": "model-00012-of-00015.safetensors",
+     "model.layers.48.mlp.down_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.48.mlp.gate_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.48.mlp.up_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.48.post_attention_layernorm.weight": "model-00012-of-00015.safetensors",
+     "model.layers.48.self_attn.k_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.48.self_attn.o_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.48.self_attn.q_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.48.self_attn.v_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.49.input_layernorm.weight": "model-00012-of-00015.safetensors",
+     "model.layers.49.mlp.down_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.49.mlp.gate_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.49.mlp.up_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.49.post_attention_layernorm.weight": "model-00012-of-00015.safetensors",
+     "model.layers.49.self_attn.k_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.49.self_attn.o_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.49.self_attn.q_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.49.self_attn.v_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.5.input_layernorm.weight": "model-00002-of-00015.safetensors",
+     "model.layers.5.mlp.down_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.5.mlp.gate_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.5.mlp.up_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.5.post_attention_layernorm.weight": "model-00002-of-00015.safetensors",
+     "model.layers.5.self_attn.k_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.5.self_attn.o_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.5.self_attn.q_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.5.self_attn.v_proj.weight": "model-00002-of-00015.safetensors",
+     "model.layers.50.input_layernorm.weight": "model-00012-of-00015.safetensors",
+     "model.layers.50.mlp.down_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.50.mlp.gate_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.50.mlp.up_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.50.post_attention_layernorm.weight": "model-00012-of-00015.safetensors",
+     "model.layers.50.self_attn.k_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.50.self_attn.o_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.50.self_attn.q_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.50.self_attn.v_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.51.input_layernorm.weight": "model-00013-of-00015.safetensors",
+     "model.layers.51.mlp.down_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.51.mlp.gate_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.51.mlp.up_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.51.post_attention_layernorm.weight": "model-00013-of-00015.safetensors",
+     "model.layers.51.self_attn.k_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.51.self_attn.o_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.51.self_attn.q_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.51.self_attn.v_proj.weight": "model-00012-of-00015.safetensors",
+     "model.layers.52.input_layernorm.weight": "model-00013-of-00015.safetensors",
+     "model.layers.52.mlp.down_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.52.mlp.gate_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.52.mlp.up_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.52.post_attention_layernorm.weight": "model-00013-of-00015.safetensors",
+     "model.layers.52.self_attn.k_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.52.self_attn.o_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.52.self_attn.q_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.52.self_attn.v_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.53.input_layernorm.weight": "model-00013-of-00015.safetensors",
+     "model.layers.53.mlp.down_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.53.mlp.gate_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.53.mlp.up_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.53.post_attention_layernorm.weight": "model-00013-of-00015.safetensors",
+     "model.layers.53.self_attn.k_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.53.self_attn.o_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.53.self_attn.q_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.53.self_attn.v_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.54.input_layernorm.weight": "model-00013-of-00015.safetensors",
+     "model.layers.54.mlp.down_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.54.mlp.gate_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.54.mlp.up_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.54.post_attention_layernorm.weight": "model-00013-of-00015.safetensors",
+     "model.layers.54.self_attn.k_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.54.self_attn.o_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.54.self_attn.q_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.54.self_attn.v_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.55.input_layernorm.weight": "model-00014-of-00015.safetensors",
+     "model.layers.55.mlp.down_proj.weight": "model-00014-of-00015.safetensors",
+     "model.layers.55.mlp.gate_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.55.mlp.up_proj.weight": "model-00014-of-00015.safetensors",
+     "model.layers.55.post_attention_layernorm.weight": "model-00014-of-00015.safetensors",
+     "model.layers.55.self_attn.k_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.55.self_attn.o_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.55.self_attn.q_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.55.self_attn.v_proj.weight": "model-00013-of-00015.safetensors",
+     "model.layers.56.input_layernorm.weight": "model-00014-of-00015.safetensors",
+     "model.layers.56.mlp.down_proj.weight": "model-00014-of-00015.safetensors",
+     "model.layers.56.mlp.gate_proj.weight": "model-00014-of-00015.safetensors",
+     "model.layers.56.mlp.up_proj.weight": "model-00014-of-00015.safetensors",
+     "model.layers.56.post_attention_layernorm.weight": "model-00014-of-00015.safetensors",
+     "model.layers.56.self_attn.k_proj.weight": "model-00014-of-00015.safetensors",
+     "model.layers.56.self_attn.o_proj.weight": "model-00014-of-00015.safetensors",
+     "model.layers.56.self_attn.q_proj.weight": "model-00014-of-00015.safetensors",
+     "model.layers.56.self_attn.v_proj.weight": "model-00014-of-00015.safetensors",
+     "model.layers.57.input_layernorm.weight": "model-00014-of-00015.safetensors",
+     "model.layers.57.mlp.down_proj.weight": "model-00014-of-00015.safetensors",
+     "model.layers.57.mlp.gate_proj.weight": "model-00014-of-00015.safetensors",
488
+ "model.layers.57.mlp.up_proj.weight": "model-00014-of-00015.safetensors",
489
+ "model.layers.57.post_attention_layernorm.weight": "model-00014-of-00015.safetensors",
490
+ "model.layers.57.self_attn.k_proj.weight": "model-00014-of-00015.safetensors",
491
+ "model.layers.57.self_attn.o_proj.weight": "model-00014-of-00015.safetensors",
492
+ "model.layers.57.self_attn.q_proj.weight": "model-00014-of-00015.safetensors",
493
+ "model.layers.57.self_attn.v_proj.weight": "model-00014-of-00015.safetensors",
494
+ "model.layers.58.input_layernorm.weight": "model-00014-of-00015.safetensors",
495
+ "model.layers.58.mlp.down_proj.weight": "model-00014-of-00015.safetensors",
496
+ "model.layers.58.mlp.gate_proj.weight": "model-00014-of-00015.safetensors",
497
+ "model.layers.58.mlp.up_proj.weight": "model-00014-of-00015.safetensors",
498
+ "model.layers.58.post_attention_layernorm.weight": "model-00014-of-00015.safetensors",
499
+ "model.layers.58.self_attn.k_proj.weight": "model-00014-of-00015.safetensors",
500
+ "model.layers.58.self_attn.o_proj.weight": "model-00014-of-00015.safetensors",
501
+ "model.layers.58.self_attn.q_proj.weight": "model-00014-of-00015.safetensors",
502
+ "model.layers.58.self_attn.v_proj.weight": "model-00014-of-00015.safetensors",
503
+ "model.layers.59.input_layernorm.weight": "model-00015-of-00015.safetensors",
504
+ "model.layers.59.mlp.down_proj.weight": "model-00015-of-00015.safetensors",
505
+ "model.layers.59.mlp.gate_proj.weight": "model-00014-of-00015.safetensors",
506
+ "model.layers.59.mlp.up_proj.weight": "model-00014-of-00015.safetensors",
507
+ "model.layers.59.post_attention_layernorm.weight": "model-00015-of-00015.safetensors",
508
+ "model.layers.59.self_attn.k_proj.weight": "model-00014-of-00015.safetensors",
509
+ "model.layers.59.self_attn.o_proj.weight": "model-00014-of-00015.safetensors",
510
+ "model.layers.59.self_attn.q_proj.weight": "model-00014-of-00015.safetensors",
511
+ "model.layers.59.self_attn.v_proj.weight": "model-00014-of-00015.safetensors",
512
+ "model.layers.6.input_layernorm.weight": "model-00002-of-00015.safetensors",
513
+ "model.layers.6.mlp.down_proj.weight": "model-00002-of-00015.safetensors",
514
+ "model.layers.6.mlp.gate_proj.weight": "model-00002-of-00015.safetensors",
515
+ "model.layers.6.mlp.up_proj.weight": "model-00002-of-00015.safetensors",
516
+ "model.layers.6.post_attention_layernorm.weight": "model-00002-of-00015.safetensors",
517
+ "model.layers.6.self_attn.k_proj.weight": "model-00002-of-00015.safetensors",
518
+ "model.layers.6.self_attn.o_proj.weight": "model-00002-of-00015.safetensors",
519
+ "model.layers.6.self_attn.q_proj.weight": "model-00002-of-00015.safetensors",
520
+ "model.layers.6.self_attn.v_proj.weight": "model-00002-of-00015.safetensors",
521
+ "model.layers.7.input_layernorm.weight": "model-00003-of-00015.safetensors",
522
+ "model.layers.7.mlp.down_proj.weight": "model-00003-of-00015.safetensors",
523
+ "model.layers.7.mlp.gate_proj.weight": "model-00002-of-00015.safetensors",
524
+ "model.layers.7.mlp.up_proj.weight": "model-00002-of-00015.safetensors",
525
+ "model.layers.7.post_attention_layernorm.weight": "model-00003-of-00015.safetensors",
526
+ "model.layers.7.self_attn.k_proj.weight": "model-00002-of-00015.safetensors",
527
+ "model.layers.7.self_attn.o_proj.weight": "model-00002-of-00015.safetensors",
528
+ "model.layers.7.self_attn.q_proj.weight": "model-00002-of-00015.safetensors",
529
+ "model.layers.7.self_attn.v_proj.weight": "model-00002-of-00015.safetensors",
530
+ "model.layers.8.input_layernorm.weight": "model-00003-of-00015.safetensors",
531
+ "model.layers.8.mlp.down_proj.weight": "model-00003-of-00015.safetensors",
532
+ "model.layers.8.mlp.gate_proj.weight": "model-00003-of-00015.safetensors",
533
+ "model.layers.8.mlp.up_proj.weight": "model-00003-of-00015.safetensors",
534
+ "model.layers.8.post_attention_layernorm.weight": "model-00003-of-00015.safetensors",
535
+ "model.layers.8.self_attn.k_proj.weight": "model-00003-of-00015.safetensors",
536
+ "model.layers.8.self_attn.o_proj.weight": "model-00003-of-00015.safetensors",
537
+ "model.layers.8.self_attn.q_proj.weight": "model-00003-of-00015.safetensors",
538
+ "model.layers.8.self_attn.v_proj.weight": "model-00003-of-00015.safetensors",
539
+ "model.layers.9.input_layernorm.weight": "model-00003-of-00015.safetensors",
540
+ "model.layers.9.mlp.down_proj.weight": "model-00003-of-00015.safetensors",
541
+ "model.layers.9.mlp.gate_proj.weight": "model-00003-of-00015.safetensors",
542
+ "model.layers.9.mlp.up_proj.weight": "model-00003-of-00015.safetensors",
543
+ "model.layers.9.post_attention_layernorm.weight": "model-00003-of-00015.safetensors",
544
+ "model.layers.9.self_attn.k_proj.weight": "model-00003-of-00015.safetensors",
545
+ "model.layers.9.self_attn.o_proj.weight": "model-00003-of-00015.safetensors",
546
+ "model.layers.9.self_attn.q_proj.weight": "model-00003-of-00015.safetensors",
547
+ "model.layers.9.self_attn.v_proj.weight": "model-00003-of-00015.safetensors",
548
+ "model.norm.weight": "model-00015-of-00015.safetensors"
549
+ }
550
+ }
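The weight_map above tells loaders which of the 15 shards holds each tensor, so a single weight can be read without touching the other ~14 files. A minimal sketch of resolving one tensor by hand with the safetensors library (the local directory path and the chosen tensor key are assumptions for illustration; any key from the index works the same way):

import json
from safetensors import safe_open

# Assumed local checkout of this repo; adjust the path as needed.
repo_dir = "."

with open(f"{repo_dir}/model.safetensors.index.json") as f:
    index = json.load(f)

# Look up which shard holds a given tensor, then open only that shard.
name = "model.layers.59.mlp.down_proj.weight"
shard = index["weight_map"][name]  # e.g. "model-00015-of-00015.safetensors"

with safe_open(f"{repo_dir}/{shard}", framework="pt", device="cpu") as f:
    tensor = f.get_tensor(name)
print(name, tuple(tensor.shape))

transformers' from_pretrained does the same resolution automatically; the sketch is only to show what the index encodes.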
special_tokens_map.json ADDED
@@ -0,0 +1,30 @@
+ {
+ "bos_token": {
+ "content": "<|startoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
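This file maps the tokenizer's special roles onto concrete tokens; note that padding reuses <unk> rather than a dedicated pad token. A quick check of how these entries surface, assuming the repo is loaded with transformers' AutoTokenizer (the "." path is a placeholder for a local checkout):

from transformers import AutoTokenizer

# Placeholder path to a local checkout of this repo.
tok = AutoTokenizer.from_pretrained(".", use_fast=False)

# Should report <|startoftext|>, <|endoftext|>, and <unk> twice
# (pad and unk share the same token under this config).
print(tok.bos_token, tok.eos_token, tok.pad_token, tok.unk_token)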
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:386c49cf943d71aa110361135338c50e38beeff0a66593480421f37b319e1a39
+ size 1033105
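tokenizer.model is stored through Git LFS, so a plain git clone without LFS leaves this three-line pointer stub in place of the ~1 MB sentencepiece model. One way to confirm the real file was fetched is to check it against the size and sha256 oid recorded above (a sketch; the filename and values come straight from this diff):

import hashlib
import os

path = "tokenizer.model"
digest = hashlib.sha256(open(path, "rb").read()).hexdigest()

# Both checks should print True once LFS has replaced the pointer stub.
print(os.path.getsize(path) == 1033105)
print(digest == "386c49cf943d71aa110361135338c50e38beeff0a66593480421f37b319e1a39")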
tokenizer_config.json ADDED
@@ -0,0 +1,43 @@
+ {
+ "add_bos_token": false,
+ "add_eos_token": false,
+ "added_tokens_decoder": {
+ "0": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "1": {
+ "content": "<|startoftext|>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "2": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<|startoftext|>",
+ "clean_up_tokenization_spaces": false,
+ "eos_token": "<|endoftext|>",
+ "legacy": false,
+ "model_max_length": 200000,
+ "pad_token": "<unk>",
+ "padding_side": "right",
+ "sp_model_kwargs": {},
+ "spaces_between_special_tokens": false,
+ "tokenizer_class": "LlamaTokenizer",
+ "truncation_side": "right",
+ "unk_token": "<unk>",
+ "use_default_system_prompt": false
+ }
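Two settings in this config are easy to miss: model_max_length is 200000 rather than the usual few thousand, and add_bos_token is false, so <|startoftext|> is not prepended automatically during encoding. A small sketch of the resulting behavior (assuming the tokenizer is loaded from a local checkout; the "." path is a placeholder):

from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained(".", use_fast=False)
print(tok.model_max_length)  # 200000

ids = tok("hello").input_ids
# With add_bos_token false, the BOS id is absent from the encoding;
# prepend it yourself if your prompt format expects it.
print(ids[0] == tok.bos_token_id)  # False under this config
ids = [tok.bos_token_id] + ids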