Helw150 committed on
Commit d0c03a8
1 Parent(s): 76cef4e
LICENSE ADDED
@@ -0,0 +1,373 @@
+ Mozilla Public License Version 2.0
+ ==================================
+
+ 1. Definitions
+ --------------
+
+ 1.1. "Contributor"
+     means each individual or legal entity that creates, contributes to
+     the creation of, or owns Covered Software.
+
+ 1.2. "Contributor Version"
+     means the combination of the Contributions of others (if any) used
+     by a Contributor and that particular Contributor's Contribution.
+
+ 1.3. "Contribution"
+     means Covered Software of a particular Contributor.
+
+ 1.4. "Covered Software"
+     means Source Code Form to which the initial Contributor has attached
+     the notice in Exhibit A, the Executable Form of such Source Code
+     Form, and Modifications of such Source Code Form, in each case
+     including portions thereof.
+
+ 1.5. "Incompatible With Secondary Licenses"
+     means
+
+     (a) that the initial Contributor has attached the notice described
+         in Exhibit B to the Covered Software; or
+
+     (b) that the Covered Software was made available under the terms of
+         version 1.1 or earlier of the License, but not also under the
+         terms of a Secondary License.
+
+ 1.6. "Executable Form"
+     means any form of the work other than Source Code Form.
+
+ 1.7. "Larger Work"
+     means a work that combines Covered Software with other material, in
+     a separate file or files, that is not Covered Software.
+
+ 1.8. "License"
+     means this document.
+
+ 1.9. "Licensable"
+     means having the right to grant, to the maximum extent possible,
+     whether at the time of the initial grant or subsequently, any and
+     all of the rights conveyed by this License.
+
+ 1.10. "Modifications"
+     means any of the following:
+
+     (a) any file in Source Code Form that results from an addition to,
+         deletion from, or modification of the contents of Covered
+         Software; or
+
+     (b) any new file in Source Code Form that contains any Covered
+         Software.
+
+ 1.11. "Patent Claims" of a Contributor
+     means any patent claim(s), including without limitation, method,
+     process, and apparatus claims, in any patent Licensable by such
+     Contributor that would be infringed, but for the grant of the
+     License, by the making, using, selling, offering for sale, having
+     made, import, or transfer of either its Contributions or its
+     Contributor Version.
+
+ 1.12. "Secondary License"
+     means either the GNU General Public License, Version 2.0, the GNU
+     Lesser General Public License, Version 2.1, the GNU Affero General
+     Public License, Version 3.0, or any later versions of those
+     licenses.
+
+ 1.13. "Source Code Form"
+     means the form of the work preferred for making modifications.
+
+ 1.14. "You" (or "Your")
+     means an individual or a legal entity exercising rights under this
+     License. For legal entities, "You" includes any entity that
+     controls, is controlled by, or is under common control with You. For
+     purposes of this definition, "control" means (a) the power, direct
+     or indirect, to cause the direction or management of such entity,
+     whether by contract or otherwise, or (b) ownership of more than
+     fifty percent (50%) of the outstanding shares or beneficial
+     ownership of such entity.
+
+ 2. License Grants and Conditions
+ --------------------------------
+
+ 2.1. Grants
+
+ Each Contributor hereby grants You a world-wide, royalty-free,
+ non-exclusive license:
+
+ (a) under intellectual property rights (other than patent or trademark)
+     Licensable by such Contributor to use, reproduce, make available,
+     modify, display, perform, distribute, and otherwise exploit its
+     Contributions, either on an unmodified basis, with Modifications, or
+     as part of a Larger Work; and
+
+ (b) under Patent Claims of such Contributor to make, use, sell, offer
+     for sale, have made, import, and otherwise transfer either its
+     Contributions or its Contributor Version.
+
+ 2.2. Effective Date
+
+ The licenses granted in Section 2.1 with respect to any Contribution
+ become effective for each Contribution on the date the Contributor first
+ distributes such Contribution.
+
+ 2.3. Limitations on Grant Scope
+
+ The licenses granted in this Section 2 are the only rights granted under
+ this License. No additional rights or licenses will be implied from the
+ distribution or licensing of Covered Software under this License.
+ Notwithstanding Section 2.1(b) above, no patent license is granted by a
+ Contributor:
+
+ (a) for any code that a Contributor has removed from Covered Software;
+     or
+
+ (b) for infringements caused by: (i) Your and any other third party's
+     modifications of Covered Software, or (ii) the combination of its
+     Contributions with other software (except as part of its Contributor
+     Version); or
+
+ (c) under Patent Claims infringed by Covered Software in the absence of
+     its Contributions.
+
+ This License does not grant any rights in the trademarks, service marks,
+ or logos of any Contributor (except as may be necessary to comply with
+ the notice requirements in Section 3.4).
+
+ 2.4. Subsequent Licenses
+
+ No Contributor makes additional grants as a result of Your choice to
+ distribute the Covered Software under a subsequent version of this
+ License (see Section 10.2) or under the terms of a Secondary License (if
+ permitted under the terms of Section 3.3).
+
+ 2.5. Representation
+
+ Each Contributor represents that the Contributor believes its
+ Contributions are its original creation(s) or it has sufficient rights
+ to grant the rights to its Contributions conveyed by this License.
+
+ 2.6. Fair Use
+
+ This License is not intended to limit any rights You have under
+ applicable copyright doctrines of fair use, fair dealing, or other
+ equivalents.
+
+ 2.7. Conditions
+
+ Sections 3.1, 3.2, 3.3, and 3.4 are conditions of the licenses granted
+ in Section 2.1.
+
+ 3. Responsibilities
+ -------------------
+
+ 3.1. Distribution of Source Form
+
+ All distribution of Covered Software in Source Code Form, including any
+ Modifications that You create or to which You contribute, must be under
+ the terms of this License. You must inform recipients that the Source
+ Code Form of the Covered Software is governed by the terms of this
+ License, and how they can obtain a copy of this License. You may not
+ attempt to alter or restrict the recipients' rights in the Source Code
+ Form.
+
+ 3.2. Distribution of Executable Form
+
+ If You distribute Covered Software in Executable Form then:
+
+ (a) such Covered Software must also be made available in Source Code
+     Form, as described in Section 3.1, and You must inform recipients of
+     the Executable Form how they can obtain a copy of such Source Code
+     Form by reasonable means in a timely manner, at a charge no more
+     than the cost of distribution to the recipient; and
+
+ (b) You may distribute such Executable Form under the terms of this
+     License, or sublicense it under different terms, provided that the
+     license for the Executable Form does not attempt to limit or alter
+     the recipients' rights in the Source Code Form under this License.
+
+ 3.3. Distribution of a Larger Work
+
+ You may create and distribute a Larger Work under terms of Your choice,
+ provided that You also comply with the requirements of this License for
+ the Covered Software. If the Larger Work is a combination of Covered
+ Software with a work governed by one or more Secondary Licenses, and the
+ Covered Software is not Incompatible With Secondary Licenses, this
+ License permits You to additionally distribute such Covered Software
+ under the terms of such Secondary License(s), so that the recipient of
+ the Larger Work may, at their option, further distribute the Covered
+ Software under the terms of either this License or such Secondary
+ License(s).
+
+ 3.4. Notices
+
+ You may not remove or alter the substance of any license notices
+ (including copyright notices, patent notices, disclaimers of warranty,
+ or limitations of liability) contained within the Source Code Form of
+ the Covered Software, except that You may alter any license notices to
+ the extent required to remedy known factual inaccuracies.
+
+ 3.5. Application of Additional Terms
+
+ You may choose to offer, and to charge a fee for, warranty, support,
+ indemnity or liability obligations to one or more recipients of Covered
+ Software. However, You may do so only on Your own behalf, and not on
+ behalf of any Contributor. You must make it absolutely clear that any
+ such warranty, support, indemnity, or liability obligation is offered by
+ You alone, and You hereby agree to indemnify every Contributor for any
+ liability incurred by such Contributor as a result of warranty, support,
+ indemnity or liability terms You offer. You may include additional
+ disclaimers of warranty and limitations of liability specific to any
+ jurisdiction.
+
+ 4. Inability to Comply Due to Statute or Regulation
+ ---------------------------------------------------
+
+ If it is impossible for You to comply with any of the terms of this
+ License with respect to some or all of the Covered Software due to
+ statute, judicial order, or regulation then You must: (a) comply with
+ the terms of this License to the maximum extent possible; and (b)
+ describe the limitations and the code they affect. Such description must
+ be placed in a text file included with all distributions of the Covered
+ Software under this License. Except to the extent prohibited by statute
+ or regulation, such description must be sufficiently detailed for a
+ recipient of ordinary skill to be able to understand it.
+
+ 5. Termination
+ --------------
+
+ 5.1. The rights granted under this License will terminate automatically
+ if You fail to comply with any of its terms. However, if You become
+ compliant, then the rights granted under this License from a particular
+ Contributor are reinstated (a) provisionally, unless and until such
+ Contributor explicitly and finally terminates Your grants, and (b) on an
+ ongoing basis, if such Contributor fails to notify You of the
+ non-compliance by some reasonable means prior to 60 days after You have
+ come back into compliance. Moreover, Your grants from a particular
+ Contributor are reinstated on an ongoing basis if such Contributor
+ notifies You of the non-compliance by some reasonable means, this is the
+ first time You have received notice of non-compliance with this License
+ from such Contributor, and You become compliant prior to 30 days after
+ Your receipt of the notice.
+
+ 5.2. If You initiate litigation against any entity by asserting a patent
+ infringement claim (excluding declaratory judgment actions,
+ counter-claims, and cross-claims) alleging that a Contributor Version
+ directly or indirectly infringes any patent, then the rights granted to
+ You by any and all Contributors for the Covered Software under Section
+ 2.1 of this License shall terminate.
+
+ 5.3. In the event of termination under Sections 5.1 or 5.2 above, all
+ end user license agreements (excluding distributors and resellers) which
+ have been validly granted by You or Your distributors under this License
+ prior to termination shall survive termination.
+
+ ************************************************************************
+ *                                                                      *
+ *  6. Disclaimer of Warranty                                           *
+ *  -------------------------                                           *
+ *                                                                      *
+ *  Covered Software is provided under this License on an "as is"       *
+ *  basis, without warranty of any kind, either expressed, implied, or  *
+ *  statutory, including, without limitation, warranties that the       *
+ *  Covered Software is free of defects, merchantable, fit for a        *
+ *  particular purpose or non-infringing. The entire risk as to the     *
+ *  quality and performance of the Covered Software is with You.        *
+ *  Should any Covered Software prove defective in any respect, You     *
+ *  (not any Contributor) assume the cost of any necessary servicing,   *
+ *  repair, or correction. This disclaimer of warranty constitutes an   *
+ *  essential part of this License. No use of any Covered Software is   *
+ *  authorized under this License except under this disclaimer.         *
+ *                                                                      *
+ ************************************************************************
+
+ ************************************************************************
+ *                                                                      *
+ *  7. Limitation of Liability                                          *
+ *  --------------------------                                          *
+ *                                                                      *
+ *  Under no circumstances and under no legal theory, whether tort      *
+ *  (including negligence), contract, or otherwise, shall any           *
+ *  Contributor, or anyone who distributes Covered Software as          *
+ *  permitted above, be liable to You for any direct, indirect,         *
+ *  special, incidental, or consequential damages of any character      *
+ *  including, without limitation, damages for lost profits, loss of    *
+ *  goodwill, work stoppage, computer failure or malfunction, or any    *
+ *  and all other commercial damages or losses, even if such party      *
+ *  shall have been informed of the possibility of such damages. This   *
+ *  limitation of liability shall not apply to liability for death or   *
+ *  personal injury resulting from such party's negligence to the       *
+ *  extent applicable law prohibits such limitation. Some               *
+ *  jurisdictions do not allow the exclusion or limitation of           *
+ *  incidental or consequential damages, so this exclusion and          *
+ *  limitation may not apply to You.                                    *
+ *                                                                      *
+ ************************************************************************
+
+ 8. Litigation
+ -------------
+
+ Any litigation relating to this License may be brought only in the
+ courts of a jurisdiction where the defendant maintains its principal
+ place of business and such litigation shall be governed by laws of that
+ jurisdiction, without reference to its conflict-of-law provisions.
+ Nothing in this Section shall prevent a party's ability to bring
+ cross-claims or counter-claims.
+
+ 9. Miscellaneous
+ ----------------
+
+ This License represents the complete agreement concerning the subject
+ matter hereof. If any provision of this License is held to be
+ unenforceable, such provision shall be reformed only to the extent
+ necessary to make it enforceable. Any law or regulation which provides
+ that the language of a contract shall be construed against the drafter
+ shall not be used to construe this License against a Contributor.
+
+ 10. Versions of the License
+ ---------------------------
+
+ 10.1. New Versions
+
+ William Held, Diyi Yang, Georgia Tech, and Stanford University are the license stewards. Except as provided in Section
+ 10.3, no one other than the license steward has the right to modify or
+ publish new versions of this License. Each version will be given a
+ distinguishing version number.
+
+ 10.2. Effect of New Versions
+
+ You may distribute the Covered Software under the terms of the version
+ of the License under which You originally received the Covered Software,
+ or under the terms of any subsequent version published by the license
+ steward.
+
+ 10.3. Modified Versions
+
+ If you create software not governed by this License, and you want to
+ create a new license for such software, you may create and use a
+ modified version of this License if you rename the license and remove
+ any references to the name of the license steward (except to note that
+ such modified license differs from this License).
+
+ 10.4. Distributing Source Code Form that is Incompatible With Secondary
+ Licenses
+
+ If You choose to distribute Source Code Form that is Incompatible With
+ Secondary Licenses under the terms of this version of the License, the
+ notice described in Exhibit B of this License must be attached.
+
+ Exhibit A - Source Code Form License Notice
+ -------------------------------------------
+
+ This Source Code Form is subject to the terms of the Mozilla Public
+ License, v. 2.0. If a copy of the MPL was not distributed with this
+ file, You can obtain one at https://mozilla.org/MPL/2.0/.
+
+ If it is not possible or desirable to put the notice in a particular
+ file, then You may include the notice in a location (such as a LICENSE
+ file in a relevant directory) where a recipient would be likely to look
+ for such a notice.
+
+ You may add additional accurate notices of copyright ownership.
+
+ Exhibit B - "Incompatible With Secondary Licenses" Notice
+ ---------------------------------------------------------
+
+ This Source Code Form is "Incompatible With Secondary Licenses", as
+ defined by the Mozilla Public License, v. 2.0.
README.md ADDED
@@ -0,0 +1,76 @@
+ # Model Card for DiVA Llama 3
+
+ <!-- Provide a quick summary of what the model is/does. [Optional] -->
+ This is an end-to-end Voice Assistant model that can handle speech and text as inputs. It is trained using a distillation loss. More details will be in a paper [COMING SOON]!
+
+ See the model in action compared to SALMONN and Qwen-Audio at [diva-audio.github.io](https://diva-audio.github.io).
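+
+ A minimal loading sketch (untested; the repository id below is a placeholder, and `bfloat16` mirrors the decoder's `torch_dtype` in `config.json`). The `auto_map` entries in this repo's `config.json` route `AutoModel` to `modeling_diva.DiVAModel`, so `trust_remote_code=True` is required:
+
+ ```python
+ import torch
+ from transformers import AutoModel
+
+ # DiVAModel is defined in this repository's modeling_diva.py rather than
+ # in the transformers library itself, hence trust_remote_code=True.
+ model = AutoModel.from_pretrained(
+     "<this-repo-id>",  # placeholder: substitute this repository's Hub id
+     trust_remote_code=True,
+     torch_dtype=torch.bfloat16,
+ )
+ ```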
+ ## Citation
+
+ There is no publication as of yet, but if you use this model, please cite the entry below.
+
+ **BibTeX:**
+
+ ```
+ @misc{held2024diva,
+   author="Held, Will and Zhang, Yanzhe and Ryan, Michael and Shi, Weiyan and Li, Ella and Yang, Diyi",
+   title="Distilling an End-to-End Voice Assistant from Speech Recognition Data",
+   year="2024",
+   publisher="HuggingFace",
+ }
+ ```
+
+ ## Table of Contents
+
+ - [Model Card for DiVA Llama 3](#model-card-for-diva-llama-3)
+ - [Citation](#citation)
+ - [Table of Contents](#table-of-contents)
+ - [Training Details](#training-details)
+   - [Training Data](#training-data)
+   - [Training Procedure](#training-procedure)
+   - [Environmental Impact](#environmental-impact)
+ - [Technical Specifications [optional]](#technical-specifications-optional)
+   - [Model Architecture and Objective](#model-architecture-and-objective)
+   - [Compute Infrastructure](#compute-infrastructure)
+     - [Hardware](#hardware)
+     - [Software](#software)
+ - [Model Card Contact](#model-card-contact)
+
+ ## Training Details
+
+ ### Training Data
+
+ <!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
+
+ This model was trained on the [CommonVoice](https://huggingface.co/datasets/mozilla-foundation/common_voice_16_1) corpus.
+
+ ### Training Procedure
+
+ This model was trained for 7k gradient steps with a batch size of 512 recordings and a linearly decaying learning rate from 5e-5 to zero, with a linear warmup of 70 steps.
+
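+ As an illustrative sketch (this is not Levanter's configuration API, just the arithmetic the sentence above describes):
+
+ ```python
+ def lr_at(step: int, total: int = 7000, warmup: int = 70, peak: float = 5e-5) -> float:
+     """Linear warmup for the first 70 steps, then linear decay to zero at step 7k."""
+     if step < warmup:
+         return peak * step / warmup
+     return peak * max(0.0, (total - step) / (total - warmup))
+ ```
+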
+ ### Environmental Impact
+
+ - **Hardware Type:** TPU v4-32
+ - **Hours used:** 8 hours
+ - **Cloud Provider:** Google Cloud
+ - **Compute Region:** US Central C
+
+ ### Hardware
+
+ This model was trained on a TPU v4 on Google Cloud.
+
+ ### Software
+
+ This model was trained with [Levanter](https://github.com/stanford-crfm/levanter).
+
+ ## Model Card Authors [optional]
+
+ <!-- This section provides another layer of transparency and accountability. Whose views is this model card representing? How many voices were included in its construction? Etc. -->
+
+ Will Held
+
+ ## Model Card Contact
+
+ held@stanford.edu
__init__.py ADDED
File without changes
config.json ADDED
@@ -0,0 +1,138 @@
+ {
+   "model_type": "diva",
+   "architectures": [ "DiVAModel" ],
+   "auto_map": {
+     "AutoConfig": "configuring_diva.DiVAConfig",
+     "AutoModel": "modeling_diva.DiVAModel"
+   },
+   "vocab_size": 128256,
+   "decoder": {
+     "architectures": [
+       "LlamaForCausalLM"
+     ],
+     "attention_bias": false,
+     "attention_dropout": 0,
+     "bos_token_id": 128000,
+     "eos_token_id": 128001,
+     "hidden_act": "silu",
+     "hidden_size": 4096,
+     "initializer_range": 0.02,
+     "intermediate_size": 14336,
+     "max_position_embeddings": 8192,
+     "model_type": "llama",
+     "num_attention_heads": 32,
+     "num_hidden_layers": 32,
+     "num_key_value_heads": 8,
+     "pretraining_tp": 1,
+     "rms_norm_eps": 1e-05,
+     "rope_scaling": null,
+     "rope_theta": 500000,
+     "tie_word_embeddings": false,
+     "torch_dtype": "bfloat16",
+     "transformers_version": "4.40.0.dev0",
+     "use_cache": true,
+     "vocab_size": 128256
+   },
+   "encoder": {
+     "_name_or_path": "openai/whisper-large-v3",
+     "activation_dropout": 0,
+     "activation_function": "gelu",
+     "add_cross_attention": false,
+     "apply_spec_augment": false,
+     "architectures": [
+       "WhisperForConditionalGeneration"
+     ],
+     "attention_dropout": 0,
+     "bad_words_ids": null,
+     "begin_suppress_tokens": [
+       220,
+       50257
+     ],
+     "bos_token_id": 50257,
+     "chunk_size_feed_forward": 0,
+     "classifier_proj_size": 256,
+     "cross_attention_hidden_size": null,
+     "d_model": 1280,
+     "decoder_attention_heads": 20,
+     "decoder_ffn_dim": 5120,
+     "decoder_layerdrop": 0,
+     "decoder_layers": 32,
+     "decoder_start_token_id": 50258,
+     "diversity_penalty": 0,
+     "do_sample": false,
+     "dropout": 0,
+     "early_stopping": false,
+     "encoder_attention_heads": 20,
+     "encoder_ffn_dim": 5120,
+     "encoder_layerdrop": 0,
+     "encoder_layers": 32,
+     "encoder_no_repeat_ngram_size": 0,
+     "eos_token_id": 50257,
+     "exponential_decay_length_penalty": null,
+     "finetuning_task": null,
+     "forced_bos_token_id": null,
+     "forced_eos_token_id": null,
+     "id2label": {
+       "0": "LABEL_0",
+       "1": "LABEL_1"
+     },
+     "init_std": 0.02,
+     "is_decoder": false,
+     "is_encoder_decoder": true,
+     "label2id": {
+       "LABEL_0": 0,
+       "LABEL_1": 1
+     },
+     "length_penalty": 1,
+     "mask_feature_length": 10,
+     "mask_feature_min_masks": 0,
+     "mask_feature_prob": 0,
+     "mask_time_length": 10,
+     "mask_time_min_masks": 2,
+     "mask_time_prob": 0.05,
+     "max_length": 448,
+     "max_source_positions": 1500,
+     "max_target_positions": 448,
+     "median_filter_width": 7,
+     "min_length": 0,
+     "model_type": "whisper",
+     "no_repeat_ngram_size": 0,
+     "num_beam_groups": 1,
+     "num_beams": 1,
+     "num_hidden_layers": 32,
+     "num_mel_bins": 128,
+     "num_return_sequences": 1,
+     "output_attentions": false,
+     "output_hidden_states": false,
+     "output_scores": false,
+     "pad_token_id": 50256,
+     "prefix": null,
+     "problem_type": null,
+     "pruned_heads": {},
+     "remove_invalid_values": false,
+     "repetition_penalty": 1,
+     "return_dict": true,
+     "return_dict_in_generate": false,
+     "scale_embedding": false,
+     "sep_token_id": null,
+     "suppress_tokens": null,
+     "task_specific_params": null,
+     "temperature": 1,
+     "tf_legacy_loss": false,
+     "tie_encoder_decoder": false,
+     "tie_word_embeddings": true,
+     "tokenizer_class": null,
+     "top_k": 50,
+     "top_p": 1,
+     "torch_dtype": "float16",
+     "torchscript": false,
+     "transformers_version": "4.38.2",
+     "typical_p": 1,
+     "use_bfloat16": false,
+     "use_cache": true,
+     "use_weighted_layer_sum": false,
+     "vocab_size": 51866
+   },
+   "time_dialation": 4,
+   "transformers_version": "4.38.2"
+ }
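
How the pieces of this composite config fit together: the Whisper large-v3 encoder (`d_model` 1280, `max_source_positions` 1500) feeds a connector whose output is projected into the Llama 3 decoder's 4096-dimensional hidden space. The sketch below is back-of-the-envelope bookkeeping only; in particular, reading `time_dialation` as a 4x temporal downsampling factor is an assumption, not something this commit states.

```python
# Shape bookkeeping implied by config.json (interpretation, not ground truth).
whisper_frames = 1500   # encoder.max_source_positions (30 s of audio)
d_model = 1280          # encoder.d_model for whisper-large-v3
llama_hidden = 4096     # decoder.hidden_size for Llama 3 8B
time_dialation = 4      # top-level key, spelled as in config.json

# If time_dialation is a downsampling factor, 1500 frames become 375 positions.
audio_positions = whisper_frames // time_dialation
# projection.weight (see model.safetensors.index.json) would then map 1280 -> 4096.
print(audio_positions, d_model, llama_hidden)
```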
configuring_diva.py ADDED
@@ -0,0 +1,11 @@
+ from transformers import PretrainedConfig
+
+
+ class DiVAConfig(PretrainedConfig):
+     model_type = "diva"
+
+     def __init__(
+         self,
+         **kwargs,
+     ):
+         super().__init__(**kwargs)
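+
+
+ # Illustrative sketch (an editorial assumption, not part of the original file):
+ # PretrainedConfig stores unrecognized keyword arguments as attributes, so the
+ # composite settings in config.json (the "encoder"/"decoder" dicts and
+ # "time_dialation") land on a DiVAConfig instance through **kwargs:
+ #
+ #     cfg = DiVAConfig(time_dialation=4, vocab_size=128256)
+ #     assert cfg.model_type == "diva" and cfg.time_dialation == 4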
model.safetensors.index.json ADDED
@@ -0,0 +1,782 @@
+ {
+   "metadata": {
+     "total_size": 38310608896
+   },
+   "weight_map": {
+     "query_tokens": "model-00001-of-00004.safetensors",
+     "projection.weight": "model-00001-of-00004.safetensors",
+     "projection.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.0.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.1.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.2.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.3.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.4.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.5.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.6.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.7.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.8.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.9.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.10.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.11.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.12.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.13.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.14.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.15.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.16.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.17.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.18.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.19.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.20.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.21.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.22.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.23.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.24.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.25.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.26.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.27.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.28.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.29.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.30.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.31.self_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.0.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.1.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.2.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.3.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.4.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.5.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.6.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.7.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.8.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.9.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.10.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.11.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.12.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.13.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.14.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.15.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.16.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.17.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.18.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.19.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.20.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.21.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.22.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.23.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.24.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.25.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.26.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.27.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.28.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.29.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.30.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.31.self_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.0.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.1.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.2.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.3.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.4.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.5.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.6.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.7.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.8.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.9.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.10.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.11.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.12.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.13.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.14.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.15.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.16.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.17.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.18.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.19.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.20.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.21.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.22.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.23.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.24.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.25.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.26.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.27.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.28.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.29.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.30.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.31.self_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.0.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.1.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.2.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.3.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.4.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.5.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.6.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.7.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.8.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.9.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.10.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.11.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.12.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.13.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.14.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.15.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.16.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.17.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.18.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.19.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.20.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.21.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.22.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.23.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.24.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.25.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.26.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.27.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.28.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.29.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.30.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.31.self_attn.v_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.0.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.1.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.2.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.3.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.4.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.5.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.6.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.7.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.8.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.9.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.10.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.11.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.12.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.13.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.14.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.15.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.16.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.17.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.18.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.19.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.20.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.21.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.22.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.23.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.24.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.25.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.26.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.27.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.28.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.29.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.30.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.31.self_attn.v_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.0.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.1.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.2.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.3.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.4.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.5.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.6.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.7.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.8.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.9.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.10.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.11.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.12.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.13.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.14.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.15.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.16.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.17.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.18.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.19.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.20.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.21.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.22.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.23.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.24.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.25.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.26.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.27.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.28.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.29.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.30.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.31.self_attn.out_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.0.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.1.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.2.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.3.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.4.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.5.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.6.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.7.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.8.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.9.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.10.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.11.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.12.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.13.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.14.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.15.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.16.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.17.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.18.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.19.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.20.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.21.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.22.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.23.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.24.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.25.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.26.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.27.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.28.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.29.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.30.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.31.self_attn.out_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.0.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.1.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.2.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.3.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.4.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.5.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.6.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.7.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.8.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.9.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.10.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.11.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.12.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.13.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.14.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.15.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.16.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.17.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.18.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.19.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.20.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.21.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.22.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.23.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.24.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.25.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.26.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.27.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.28.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.29.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.30.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.31.self_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.0.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.1.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.2.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.3.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.4.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.5.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.6.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.7.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.8.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.9.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.10.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.11.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.12.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.13.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.14.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.15.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.16.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.17.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.18.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.19.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.20.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.21.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.22.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.23.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.24.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.25.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.26.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.27.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.28.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.29.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.30.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.31.self_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.0.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.1.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.2.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.3.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.4.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.5.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.6.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.7.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.8.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.9.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.10.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.11.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.12.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.13.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.14.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.15.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.16.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.17.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.18.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.19.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.20.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.21.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.22.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.23.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.24.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.25.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.26.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.27.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.28.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.29.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.30.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.31.encoder_attn.q_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.0.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.1.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.2.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.3.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.4.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.5.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.6.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.7.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.8.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.9.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.10.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.11.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.12.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.13.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.14.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.15.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.16.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.17.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.18.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.19.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.20.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.21.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.22.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.23.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.24.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.25.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.26.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.27.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.28.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.29.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.30.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.31.encoder_attn.q_proj.bias": "model-00001-of-00004.safetensors",
+     "connector.layers.0.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.1.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.2.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.3.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.4.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.5.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.6.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.7.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.8.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.9.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.10.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.11.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
+     "connector.layers.12.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
374
+ "connector.layers.13.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
375
+ "connector.layers.14.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
376
+ "connector.layers.15.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
377
+ "connector.layers.16.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
378
+ "connector.layers.17.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
379
+ "connector.layers.18.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
380
+ "connector.layers.19.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
381
+ "connector.layers.20.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
382
+ "connector.layers.21.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
383
+ "connector.layers.22.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
384
+ "connector.layers.23.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
385
+ "connector.layers.24.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
386
+ "connector.layers.25.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
387
+ "connector.layers.26.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
388
+ "connector.layers.27.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
389
+ "connector.layers.28.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
390
+ "connector.layers.29.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
391
+ "connector.layers.30.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
392
+ "connector.layers.31.encoder_attn.k_proj.weight": "model-00001-of-00004.safetensors",
393
+ "connector.layers.0.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
394
+ "connector.layers.1.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
395
+ "connector.layers.2.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
396
+ "connector.layers.3.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
397
+ "connector.layers.4.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
398
+ "connector.layers.5.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
399
+ "connector.layers.6.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
400
+ "connector.layers.7.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
401
+ "connector.layers.8.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
402
+ "connector.layers.9.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
403
+ "connector.layers.10.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
404
+ "connector.layers.11.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
405
+ "connector.layers.12.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
406
+ "connector.layers.13.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
407
+ "connector.layers.14.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
408
+ "connector.layers.15.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
409
+ "connector.layers.16.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
410
+ "connector.layers.17.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
411
+ "connector.layers.18.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
412
+ "connector.layers.19.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
413
+ "connector.layers.20.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
414
+ "connector.layers.21.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
415
+ "connector.layers.22.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
416
+ "connector.layers.23.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
417
+ "connector.layers.24.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
418
+ "connector.layers.25.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
419
+ "connector.layers.26.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
420
+ "connector.layers.27.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
421
+ "connector.layers.28.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
422
+ "connector.layers.29.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
423
+ "connector.layers.30.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
424
+ "connector.layers.31.encoder_attn.v_proj.weight": "model-00001-of-00004.safetensors",
425
+ "connector.layers.0.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
426
+ "connector.layers.1.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
427
+ "connector.layers.2.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
428
+ "connector.layers.3.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
429
+ "connector.layers.4.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
430
+ "connector.layers.5.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
431
+ "connector.layers.6.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
432
+ "connector.layers.7.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
433
+ "connector.layers.8.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
434
+ "connector.layers.9.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
435
+ "connector.layers.10.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
436
+ "connector.layers.11.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
437
+ "connector.layers.12.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
438
+ "connector.layers.13.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
439
+ "connector.layers.14.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
440
+ "connector.layers.15.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
441
+ "connector.layers.16.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
442
+ "connector.layers.17.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
443
+ "connector.layers.18.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
444
+ "connector.layers.19.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
445
+ "connector.layers.20.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
446
+ "connector.layers.21.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
447
+ "connector.layers.22.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
448
+ "connector.layers.23.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
449
+ "connector.layers.24.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
450
+ "connector.layers.25.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
451
+ "connector.layers.26.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
452
+ "connector.layers.27.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
453
+ "connector.layers.28.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
454
+ "connector.layers.29.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
455
+ "connector.layers.30.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
456
+ "connector.layers.31.encoder_attn.v_proj.bias": "model-00001-of-00004.safetensors",
457
+ "connector.layers.0.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
458
+ "connector.layers.1.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
459
+ "connector.layers.2.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
460
+ "connector.layers.3.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
461
+ "connector.layers.4.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
462
+ "connector.layers.5.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
463
+ "connector.layers.6.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
464
+ "connector.layers.7.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
465
+ "connector.layers.8.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
466
+ "connector.layers.9.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
467
+ "connector.layers.10.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
468
+ "connector.layers.11.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
469
+ "connector.layers.12.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
470
+ "connector.layers.13.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
471
+ "connector.layers.14.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
472
+ "connector.layers.15.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
473
+ "connector.layers.16.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
474
+ "connector.layers.17.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
475
+ "connector.layers.18.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
476
+ "connector.layers.19.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
477
+ "connector.layers.20.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
478
+ "connector.layers.21.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
479
+ "connector.layers.22.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
480
+ "connector.layers.23.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
481
+ "connector.layers.24.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
482
+ "connector.layers.25.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
483
+ "connector.layers.26.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
484
+ "connector.layers.27.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
485
+ "connector.layers.28.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
486
+ "connector.layers.29.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
487
+ "connector.layers.30.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
488
+ "connector.layers.31.encoder_attn.out_proj.weight": "model-00001-of-00004.safetensors",
489
+ "connector.layers.0.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
490
+ "connector.layers.1.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
491
+ "connector.layers.2.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
492
+ "connector.layers.3.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
493
+ "connector.layers.4.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
494
+ "connector.layers.5.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
495
+ "connector.layers.6.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
496
+ "connector.layers.7.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
497
+ "connector.layers.8.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
498
+ "connector.layers.9.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
499
+ "connector.layers.10.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
500
+ "connector.layers.11.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
501
+ "connector.layers.12.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
502
+ "connector.layers.13.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
503
+ "connector.layers.14.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
504
+ "connector.layers.15.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
505
+ "connector.layers.16.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
506
+ "connector.layers.17.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
507
+ "connector.layers.18.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
508
+ "connector.layers.19.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
509
+ "connector.layers.20.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
510
+ "connector.layers.21.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
511
+ "connector.layers.22.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
512
+ "connector.layers.23.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
513
+ "connector.layers.24.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
514
+ "connector.layers.25.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
515
+ "connector.layers.26.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
516
+ "connector.layers.27.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
517
+ "connector.layers.28.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
518
+ "connector.layers.29.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
519
+ "connector.layers.30.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
520
+ "connector.layers.31.encoder_attn.out_proj.bias": "model-00001-of-00004.safetensors",
521
+ "connector.layers.0.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
522
+ "connector.layers.1.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
523
+ "connector.layers.2.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
524
+ "connector.layers.3.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
525
+ "connector.layers.4.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
526
+ "connector.layers.5.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
527
+ "connector.layers.6.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
528
+ "connector.layers.7.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
529
+ "connector.layers.8.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
530
+ "connector.layers.9.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
531
+ "connector.layers.10.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
532
+ "connector.layers.11.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
533
+ "connector.layers.12.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
534
+ "connector.layers.13.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
535
+ "connector.layers.14.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
536
+ "connector.layers.15.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
537
+ "connector.layers.16.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
538
+ "connector.layers.17.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
539
+ "connector.layers.18.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
540
+ "connector.layers.19.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
541
+ "connector.layers.20.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
542
+ "connector.layers.21.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
543
+ "connector.layers.22.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
544
+ "connector.layers.23.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
545
+ "connector.layers.24.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
546
+ "connector.layers.25.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
547
+ "connector.layers.26.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
548
+ "connector.layers.27.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
549
+ "connector.layers.28.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
550
+ "connector.layers.29.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
551
+ "connector.layers.30.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
552
+ "connector.layers.31.encoder_attn_layer_norm.weight": "model-00001-of-00004.safetensors",
553
+ "connector.layers.0.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
554
+ "connector.layers.1.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
555
+ "connector.layers.2.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
556
+ "connector.layers.3.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
557
+ "connector.layers.4.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
558
+ "connector.layers.5.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
559
+ "connector.layers.6.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
560
+ "connector.layers.7.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
561
+ "connector.layers.8.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
562
+ "connector.layers.9.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
563
+ "connector.layers.10.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
564
+ "connector.layers.11.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
565
+ "connector.layers.12.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
566
+ "connector.layers.13.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
567
+ "connector.layers.14.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
568
+ "connector.layers.15.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
569
+ "connector.layers.16.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
570
+ "connector.layers.17.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
571
+ "connector.layers.18.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
572
+ "connector.layers.19.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
573
+ "connector.layers.20.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
574
+ "connector.layers.21.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
575
+ "connector.layers.22.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
576
+ "connector.layers.23.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
577
+ "connector.layers.24.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
578
+ "connector.layers.25.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
579
+ "connector.layers.26.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
580
+ "connector.layers.27.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
581
+ "connector.layers.28.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
582
+ "connector.layers.29.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
583
+ "connector.layers.30.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
584
+ "connector.layers.31.encoder_attn_layer_norm.bias": "model-00001-of-00004.safetensors",
585
+ "connector.layers.0.fc1.weight": "model-00001-of-00004.safetensors",
586
+ "connector.layers.1.fc1.weight": "model-00001-of-00004.safetensors",
587
+ "connector.layers.2.fc1.weight": "model-00001-of-00004.safetensors",
588
+ "connector.layers.3.fc1.weight": "model-00001-of-00004.safetensors",
589
+ "connector.layers.4.fc1.weight": "model-00001-of-00004.safetensors",
590
+ "connector.layers.5.fc1.weight": "model-00001-of-00004.safetensors",
591
+ "connector.layers.6.fc1.weight": "model-00001-of-00004.safetensors",
592
+ "connector.layers.7.fc1.weight": "model-00001-of-00004.safetensors",
593
+ "connector.layers.8.fc1.weight": "model-00001-of-00004.safetensors",
594
+ "connector.layers.9.fc1.weight": "model-00001-of-00004.safetensors",
595
+ "connector.layers.10.fc1.weight": "model-00001-of-00004.safetensors",
596
+ "connector.layers.11.fc1.weight": "model-00001-of-00004.safetensors",
597
+ "connector.layers.12.fc1.weight": "model-00001-of-00004.safetensors",
598
+ "connector.layers.13.fc1.weight": "model-00001-of-00004.safetensors",
599
+ "connector.layers.14.fc1.weight": "model-00001-of-00004.safetensors",
600
+ "connector.layers.15.fc1.weight": "model-00001-of-00004.safetensors",
601
+ "connector.layers.16.fc1.weight": "model-00001-of-00004.safetensors",
602
+ "connector.layers.17.fc1.weight": "model-00001-of-00004.safetensors",
603
+ "connector.layers.18.fc1.weight": "model-00001-of-00004.safetensors",
604
+ "connector.layers.19.fc1.weight": "model-00001-of-00004.safetensors",
605
+ "connector.layers.20.fc1.weight": "model-00001-of-00004.safetensors",
606
+ "connector.layers.21.fc1.weight": "model-00001-of-00004.safetensors",
607
+ "connector.layers.22.fc1.weight": "model-00001-of-00004.safetensors",
608
+ "connector.layers.23.fc1.weight": "model-00001-of-00004.safetensors",
609
+ "connector.layers.24.fc1.weight": "model-00001-of-00004.safetensors",
610
+ "connector.layers.25.fc1.weight": "model-00001-of-00004.safetensors",
611
+ "connector.layers.26.fc1.weight": "model-00001-of-00004.safetensors",
612
+ "connector.layers.27.fc1.weight": "model-00001-of-00004.safetensors",
613
+ "connector.layers.28.fc1.weight": "model-00001-of-00004.safetensors",
614
+ "connector.layers.29.fc1.weight": "model-00001-of-00004.safetensors",
615
+ "connector.layers.30.fc1.weight": "model-00001-of-00004.safetensors",
616
+ "connector.layers.31.fc1.weight": "model-00001-of-00004.safetensors",
617
+ "connector.layers.0.fc1.bias": "model-00001-of-00004.safetensors",
618
+ "connector.layers.1.fc1.bias": "model-00001-of-00004.safetensors",
619
+ "connector.layers.2.fc1.bias": "model-00001-of-00004.safetensors",
620
+ "connector.layers.3.fc1.bias": "model-00001-of-00004.safetensors",
621
+ "connector.layers.4.fc1.bias": "model-00001-of-00004.safetensors",
622
+ "connector.layers.5.fc1.bias": "model-00001-of-00004.safetensors",
623
+ "connector.layers.6.fc1.bias": "model-00001-of-00004.safetensors",
624
+ "connector.layers.7.fc1.bias": "model-00001-of-00004.safetensors",
625
+ "connector.layers.8.fc1.bias": "model-00001-of-00004.safetensors",
626
+ "connector.layers.9.fc1.bias": "model-00001-of-00004.safetensors",
627
+ "connector.layers.10.fc1.bias": "model-00001-of-00004.safetensors",
628
+ "connector.layers.11.fc1.bias": "model-00001-of-00004.safetensors",
629
+ "connector.layers.12.fc1.bias": "model-00001-of-00004.safetensors",
630
+ "connector.layers.13.fc1.bias": "model-00001-of-00004.safetensors",
631
+ "connector.layers.14.fc1.bias": "model-00001-of-00004.safetensors",
632
+ "connector.layers.15.fc1.bias": "model-00001-of-00004.safetensors",
633
+ "connector.layers.16.fc1.bias": "model-00001-of-00004.safetensors",
634
+ "connector.layers.17.fc1.bias": "model-00001-of-00004.safetensors",
635
+ "connector.layers.18.fc1.bias": "model-00001-of-00004.safetensors",
636
+ "connector.layers.19.fc1.bias": "model-00001-of-00004.safetensors",
637
+ "connector.layers.20.fc1.bias": "model-00001-of-00004.safetensors",
638
+ "connector.layers.21.fc1.bias": "model-00001-of-00004.safetensors",
639
+ "connector.layers.22.fc1.bias": "model-00001-of-00004.safetensors",
640
+ "connector.layers.23.fc1.bias": "model-00001-of-00004.safetensors",
641
+ "connector.layers.24.fc1.bias": "model-00001-of-00004.safetensors",
642
+ "connector.layers.25.fc1.bias": "model-00001-of-00004.safetensors",
643
+ "connector.layers.26.fc1.bias": "model-00001-of-00004.safetensors",
644
+ "connector.layers.27.fc1.bias": "model-00001-of-00004.safetensors",
645
+ "connector.layers.28.fc1.bias": "model-00001-of-00004.safetensors",
646
+ "connector.layers.29.fc1.bias": "model-00001-of-00004.safetensors",
647
+ "connector.layers.30.fc1.bias": "model-00001-of-00004.safetensors",
648
+ "connector.layers.31.fc1.bias": "model-00001-of-00004.safetensors",
649
+ "connector.layers.0.fc2.weight": "model-00001-of-00004.safetensors",
650
+ "connector.layers.1.fc2.weight": "model-00001-of-00004.safetensors",
651
+ "connector.layers.2.fc2.weight": "model-00001-of-00004.safetensors",
652
+ "connector.layers.3.fc2.weight": "model-00001-of-00004.safetensors",
653
+ "connector.layers.4.fc2.weight": "model-00001-of-00004.safetensors",
654
+ "connector.layers.5.fc2.weight": "model-00001-of-00004.safetensors",
655
+ "connector.layers.6.fc2.weight": "model-00001-of-00004.safetensors",
656
+ "connector.layers.7.fc2.weight": "model-00001-of-00004.safetensors",
657
+ "connector.layers.8.fc2.weight": "model-00001-of-00004.safetensors",
658
+ "connector.layers.9.fc2.weight": "model-00001-of-00004.safetensors",
659
+ "connector.layers.10.fc2.weight": "model-00001-of-00004.safetensors",
660
+ "connector.layers.11.fc2.weight": "model-00001-of-00004.safetensors",
661
+ "connector.layers.12.fc2.weight": "model-00001-of-00004.safetensors",
662
+ "connector.layers.13.fc2.weight": "model-00001-of-00004.safetensors",
663
+ "connector.layers.14.fc2.weight": "model-00001-of-00004.safetensors",
664
+ "connector.layers.15.fc2.weight": "model-00001-of-00004.safetensors",
665
+ "connector.layers.16.fc2.weight": "model-00001-of-00004.safetensors",
666
+ "connector.layers.17.fc2.weight": "model-00001-of-00004.safetensors",
667
+ "connector.layers.18.fc2.weight": "model-00001-of-00004.safetensors",
668
+ "connector.layers.19.fc2.weight": "model-00001-of-00004.safetensors",
669
+ "connector.layers.20.fc2.weight": "model-00001-of-00004.safetensors",
670
+ "connector.layers.21.fc2.weight": "model-00001-of-00004.safetensors",
671
+ "connector.layers.22.fc2.weight": "model-00001-of-00004.safetensors",
672
+ "connector.layers.23.fc2.weight": "model-00001-of-00004.safetensors",
673
+ "connector.layers.24.fc2.weight": "model-00001-of-00004.safetensors",
674
+ "connector.layers.25.fc2.weight": "model-00001-of-00004.safetensors",
675
+ "connector.layers.26.fc2.weight": "model-00001-of-00004.safetensors",
676
+ "connector.layers.27.fc2.weight": "model-00001-of-00004.safetensors",
677
+ "connector.layers.28.fc2.weight": "model-00001-of-00004.safetensors",
678
+ "connector.layers.29.fc2.weight": "model-00001-of-00004.safetensors",
679
+ "connector.layers.30.fc2.weight": "model-00001-of-00004.safetensors",
680
+ "connector.layers.31.fc2.weight": "model-00001-of-00004.safetensors",
681
+ "connector.layers.0.fc2.bias": "model-00001-of-00004.safetensors",
682
+ "connector.layers.1.fc2.bias": "model-00001-of-00004.safetensors",
683
+ "connector.layers.2.fc2.bias": "model-00001-of-00004.safetensors",
684
+ "connector.layers.3.fc2.bias": "model-00001-of-00004.safetensors",
685
+ "connector.layers.4.fc2.bias": "model-00001-of-00004.safetensors",
686
+ "connector.layers.5.fc2.bias": "model-00001-of-00004.safetensors",
687
+ "connector.layers.6.fc2.bias": "model-00001-of-00004.safetensors",
688
+ "connector.layers.7.fc2.bias": "model-00001-of-00004.safetensors",
689
+ "connector.layers.8.fc2.bias": "model-00001-of-00004.safetensors",
690
+ "connector.layers.9.fc2.bias": "model-00001-of-00004.safetensors",
691
+ "connector.layers.10.fc2.bias": "model-00001-of-00004.safetensors",
692
+ "connector.layers.11.fc2.bias": "model-00001-of-00004.safetensors",
693
+ "connector.layers.12.fc2.bias": "model-00001-of-00004.safetensors",
694
+ "connector.layers.13.fc2.bias": "model-00001-of-00004.safetensors",
695
+ "connector.layers.14.fc2.bias": "model-00001-of-00004.safetensors",
696
+ "connector.layers.15.fc2.bias": "model-00001-of-00004.safetensors",
697
+ "connector.layers.16.fc2.bias": "model-00001-of-00004.safetensors",
698
+ "connector.layers.17.fc2.bias": "model-00001-of-00004.safetensors",
699
+ "connector.layers.18.fc2.bias": "model-00001-of-00004.safetensors",
700
+ "connector.layers.19.fc2.bias": "model-00001-of-00004.safetensors",
701
+ "connector.layers.20.fc2.bias": "model-00001-of-00004.safetensors",
702
+ "connector.layers.21.fc2.bias": "model-00001-of-00004.safetensors",
703
+ "connector.layers.22.fc2.bias": "model-00001-of-00004.safetensors",
704
+ "connector.layers.23.fc2.bias": "model-00001-of-00004.safetensors",
705
+ "connector.layers.24.fc2.bias": "model-00001-of-00004.safetensors",
706
+ "connector.layers.25.fc2.bias": "model-00001-of-00004.safetensors",
707
+ "connector.layers.26.fc2.bias": "model-00001-of-00004.safetensors",
708
+ "connector.layers.27.fc2.bias": "model-00001-of-00004.safetensors",
709
+ "connector.layers.28.fc2.bias": "model-00001-of-00004.safetensors",
710
+ "connector.layers.29.fc2.bias": "model-00001-of-00004.safetensors",
711
+ "connector.layers.30.fc2.bias": "model-00001-of-00004.safetensors",
712
+ "connector.layers.31.fc2.bias": "model-00001-of-00004.safetensors",
713
+ "connector.layers.0.final_layer_norm.weight": "model-00001-of-00004.safetensors",
714
+ "connector.layers.1.final_layer_norm.weight": "model-00001-of-00004.safetensors",
715
+ "connector.layers.2.final_layer_norm.weight": "model-00001-of-00004.safetensors",
716
+ "connector.layers.3.final_layer_norm.weight": "model-00001-of-00004.safetensors",
717
+ "connector.layers.4.final_layer_norm.weight": "model-00001-of-00004.safetensors",
718
+ "connector.layers.5.final_layer_norm.weight": "model-00001-of-00004.safetensors",
719
+ "connector.layers.6.final_layer_norm.weight": "model-00001-of-00004.safetensors",
720
+ "connector.layers.7.final_layer_norm.weight": "model-00001-of-00004.safetensors",
721
+ "connector.layers.8.final_layer_norm.weight": "model-00001-of-00004.safetensors",
722
+ "connector.layers.9.final_layer_norm.weight": "model-00001-of-00004.safetensors",
723
+ "connector.layers.10.final_layer_norm.weight": "model-00001-of-00004.safetensors",
724
+ "connector.layers.11.final_layer_norm.weight": "model-00001-of-00004.safetensors",
725
+ "connector.layers.12.final_layer_norm.weight": "model-00001-of-00004.safetensors",
726
+ "connector.layers.13.final_layer_norm.weight": "model-00001-of-00004.safetensors",
727
+ "connector.layers.14.final_layer_norm.weight": "model-00001-of-00004.safetensors",
728
+ "connector.layers.15.final_layer_norm.weight": "model-00001-of-00004.safetensors",
729
+ "connector.layers.16.final_layer_norm.weight": "model-00001-of-00004.safetensors",
730
+ "connector.layers.17.final_layer_norm.weight": "model-00001-of-00004.safetensors",
731
+ "connector.layers.18.final_layer_norm.weight": "model-00001-of-00004.safetensors",
732
+ "connector.layers.19.final_layer_norm.weight": "model-00001-of-00004.safetensors",
733
+ "connector.layers.20.final_layer_norm.weight": "model-00001-of-00004.safetensors",
734
+ "connector.layers.21.final_layer_norm.weight": "model-00001-of-00004.safetensors",
735
+ "connector.layers.22.final_layer_norm.weight": "model-00001-of-00004.safetensors",
736
+ "connector.layers.23.final_layer_norm.weight": "model-00001-of-00004.safetensors",
737
+ "connector.layers.24.final_layer_norm.weight": "model-00001-of-00004.safetensors",
738
+ "connector.layers.25.final_layer_norm.weight": "model-00001-of-00004.safetensors",
739
+ "connector.layers.26.final_layer_norm.weight": "model-00001-of-00004.safetensors",
740
+ "connector.layers.27.final_layer_norm.weight": "model-00001-of-00004.safetensors",
741
+ "connector.layers.28.final_layer_norm.weight": "model-00001-of-00004.safetensors",
742
+ "connector.layers.29.final_layer_norm.weight": "model-00001-of-00004.safetensors",
743
+ "connector.layers.30.final_layer_norm.weight": "model-00001-of-00004.safetensors",
744
+ "connector.layers.31.final_layer_norm.weight": "model-00001-of-00004.safetensors",
745
+ "connector.layers.0.final_layer_norm.bias": "model-00001-of-00004.safetensors",
746
+ "connector.layers.1.final_layer_norm.bias": "model-00001-of-00004.safetensors",
747
+ "connector.layers.2.final_layer_norm.bias": "model-00001-of-00004.safetensors",
748
+ "connector.layers.3.final_layer_norm.bias": "model-00001-of-00004.safetensors",
749
+ "connector.layers.4.final_layer_norm.bias": "model-00001-of-00004.safetensors",
750
+ "connector.layers.5.final_layer_norm.bias": "model-00001-of-00004.safetensors",
751
+ "connector.layers.6.final_layer_norm.bias": "model-00001-of-00004.safetensors",
752
+ "connector.layers.7.final_layer_norm.bias": "model-00001-of-00004.safetensors",
753
+ "connector.layers.8.final_layer_norm.bias": "model-00001-of-00004.safetensors",
754
+ "connector.layers.9.final_layer_norm.bias": "model-00001-of-00004.safetensors",
755
+ "connector.layers.10.final_layer_norm.bias": "model-00001-of-00004.safetensors",
756
+ "connector.layers.11.final_layer_norm.bias": "model-00001-of-00004.safetensors",
757
+ "connector.layers.12.final_layer_norm.bias": "model-00001-of-00004.safetensors",
758
+ "connector.layers.13.final_layer_norm.bias": "model-00001-of-00004.safetensors",
759
+ "connector.layers.14.final_layer_norm.bias": "model-00001-of-00004.safetensors",
760
+ "connector.layers.15.final_layer_norm.bias": "model-00001-of-00004.safetensors",
761
+ "connector.layers.16.final_layer_norm.bias": "model-00001-of-00004.safetensors",
762
+ "connector.layers.17.final_layer_norm.bias": "model-00001-of-00004.safetensors",
763
+ "connector.layers.18.final_layer_norm.bias": "model-00001-of-00004.safetensors",
764
+ "connector.layers.19.final_layer_norm.bias": "model-00001-of-00004.safetensors",
765
+ "connector.layers.20.final_layer_norm.bias": "model-00001-of-00004.safetensors",
766
+ "connector.layers.21.final_layer_norm.bias": "model-00001-of-00004.safetensors",
767
+ "connector.layers.22.final_layer_norm.bias": "model-00001-of-00004.safetensors",
768
+ "connector.layers.23.final_layer_norm.bias": "model-00001-of-00004.safetensors",
769
+ "connector.layers.24.final_layer_norm.bias": "model-00001-of-00004.safetensors",
770
+ "connector.layers.25.final_layer_norm.bias": "model-00001-of-00004.safetensors",
771
+ "connector.layers.26.final_layer_norm.bias": "model-00001-of-00004.safetensors",
772
+ "connector.layers.27.final_layer_norm.bias": "model-00001-of-00004.safetensors",
773
+ "connector.layers.28.final_layer_norm.bias": "model-00001-of-00004.safetensors",
774
+ "connector.layers.29.final_layer_norm.bias": "model-00001-of-00004.safetensors",
775
+ "connector.layers.30.final_layer_norm.bias": "model-00001-of-00004.safetensors",
776
+ "connector.layers.31.final_layer_norm.bias": "model-00001-of-00004.safetensors",
777
+ "connector.layer_norm.weight": "model-00001-of-00004.safetensors",
778
+ "connector.layer_norm.bias": "model-00001-of-00004.safetensors",
779
+ "connector.embed_tokens.weight": "model-00001-of-00004.safetensors",
780
+ "connector.embed_positions.weight": "model-00001-of-00004.safetensors"
781
+ }
782
+ }
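For orientation, a sketch (not part of the commit) of how a sharded index like the one above is consumed: the `weight_map` names the shard holding each tensor, so a loader can fetch a single parameter without opening the other three shards. The index filename follows the standard sharded-checkpoint convention and is an assumption here.

# Sketch: resolve one connector tensor through the shard index.
import json

from safetensors import safe_open

with open("model.safetensors.index.json") as f:  # conventional name; adjust if the repo differs
    index = json.load(f)

name = "connector.layers.0.encoder_attn.q_proj.weight"
shard = index["weight_map"][name]  # -> "model-00001-of-00004.safetensors"
with safe_open(shard, framework="pt") as shard_file:
    tensor = shard_file.get_tensor(name)
print(name, tuple(tensor.shape))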
modeling_diva.py ADDED
@@ -0,0 +1,308 @@
+ import copy
+ import json
+ import os
+ from typing import Optional, Union
+ 
+ import torch
+ import torch.nn.functional as F
+ from safetensors.torch import load
+ from torch import nn
+ from transformers import (
+     AutoProcessor,
+     AutoTokenizer,
+     LlamaForCausalLM,
+     PreTrainedModel,
+     WhisperForConditionalGeneration,
+ )
+ 
+ from .configuring_diva import DiVAConfig
+ 
+ 
+ class WhisperConnector(nn.Module):
+     def __init__(self):
+         super().__init__()
+         self.decoder = None  # filled with a copy of the Whisper decoder by DiVAModel
+         # Whisper-large hidden size (1280) -> Llama 3 8B hidden size (4096).
+         self.projection = nn.Linear(1280, 4096)
+         # Learned queries, one per Whisper decoder position.
+         self.query_tokens = nn.Parameter(torch.randn(448, 1280))
+ 
+     def forward(self, x, output_device="cuda:1"):
+         bsz = x.shape[0]
+         query_tokens = self.query_tokens[None, :, :].expand(bsz, -1, -1)
+         virt_whisper_tokens = self.decoder(
+             inputs_embeds=query_tokens, encoder_hidden_states=x
+         )
+         # A 5120-wide projection indicates a checkpoint geometry with 112 query positions.
+         if self.projection.weight.shape[-1] == 5120:
+             virtual_tokens = self.projection(virt_whisper_tokens[0].reshape(112, 5120))
+         else:
+             virtual_tokens = self.projection(virt_whisper_tokens[0])
+         return virtual_tokens.to(output_device)
+ 
+ 
+ class DiVAModel(PreTrainedModel):
+     config_class = DiVAConfig
+ 
+     def __init__(
+         self, via_path=None, config_dict=None, device_map=None, speech_encoder_device=None
+     ):
+         super().__init__(DiVAConfig.from_dict(config_dict or {}))
+         if speech_encoder_device is None:
+             speech_encoder_device = "cuda:0"
+         whisper = WhisperForConditionalGeneration.from_pretrained(
+             "openai/whisper-large-v3"
+         )
+         connector = WhisperConnector()
+         connector.decoder = copy.deepcopy(whisper.model.decoder)
+         if via_path is not None:
+             with open(via_path, "rb") as f:
+                 sd = load(f.read())
+ 
+             with torch.no_grad():
+                 connector.query_tokens = nn.Parameter(sd["query_tokens"])
+                 connector.projection.weight = nn.Parameter(sd["projection.weight"].T)
+                 connector.projection.bias = nn.Parameter(sd["projection.bias"])
+                 wsd = {
+                     key.replace("connector.", ""): sd[key]
+                     for key in sd
+                     if key.startswith("connector.")
+                 }
+                 connector.decoder.load_state_dict(wsd)
+ 
+         if device_map is None:
+             # Default: split the Llama decoder across GPUs 1 and 2, leaving
+             # GPU 0 free for the Whisper encoder and connector.
+             num_layers = 32
+             num_gpus = 2
+             device_map = dict(
+                 **{"model.embed_tokens": 1, "model.norm": 1, "lm_head": 2},
+                 **{
+                     "model.layers." + str(i): 1 + (i // (num_layers // num_gpus))
+                     for i in range(num_layers)
+                 },
+             )
+ 
+         self.connector = connector.to(speech_encoder_device)
+         self.whisper_encoder = whisper.model.encoder.to(speech_encoder_device)
+         self.llama_decoder = LlamaForCausalLM.from_pretrained(
+             "meta-llama/Meta-Llama-3-8B-Instruct",
+             device_map=device_map,
+             torch_dtype=torch.float16,
+         )
+         self.processor = AutoProcessor.from_pretrained("openai/whisper-large-v3")
+         self.tokenizer = AutoTokenizer.from_pretrained("WillHeld/via-llama")
+         # Llama 3 chat scaffolding as raw token ids:
+         # <|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n
+         self.prefix = torch.tensor([128000, 128006, 882, 128007, 271]).to(
+             self.llama_decoder.model.embed_tokens.weight.device
+         )
+ 
+         self.pre_user_suffix = torch.tensor(
+             self.tokenizer.encode(
+                 "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n"
+             )
+         ).to(self.llama_decoder.model.embed_tokens.weight.device)
+         # <|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n
+         self.final_header = torch.tensor([128009, 128006, 78191, 128007, 271]).to(
+             self.llama_decoder.model.embed_tokens.weight.device
+         )
+         self.speech_encoder_device = speech_encoder_device
+ 
+     @classmethod
+     def from_pretrained(
+         cls,
+         pretrained_model_name_or_path: Optional[Union[str, os.PathLike]],
+         *model_args,
+         config=None,
+         cache_dir=None,
+         **kwargs,
+     ):
+         if os.path.isdir(pretrained_model_name_or_path):
+             via_path = (
+                 pretrained_model_name_or_path + "/model-00001-of-00004.safetensors"
+             )
+             config_path = pretrained_model_name_or_path + "/config.json"
+         else:
+             # Loading from a Hugging Face Hub repo
+             from huggingface_hub import hf_hub_download
+ 
+             hf_hub_download(
+                 repo_id=pretrained_model_name_or_path,
+                 filename="model-00001-of-00004.safetensors",
+                 token=kwargs.get("token", None),
+                 local_dir=os.path.dirname(__file__),
+             )
+             hf_hub_download(
+                 repo_id=pretrained_model_name_or_path,
+                 filename="config.json",
+                 token=kwargs.get("token", None),
+                 local_dir=os.path.dirname(__file__),
+             )
+             via_path = os.path.dirname(__file__) + "/model-00001-of-00004.safetensors"
+             config_path = os.path.dirname(__file__) + "/config.json"
+         with open(config_path, "r") as f:
+             config_dict = json.loads(f.read())
+         return cls(
+             via_path,
+             config_dict,
+             kwargs.get("device_map", "auto"),
+             kwargs.get("speech_encoder_device", None),
+         )
+ 
+     def forward(self, audio, prefix_text_tokens, suffix_text_tokens):
+         inputs = self.processor(audio, return_tensors="pt", sampling_rate=16_000)
+         input_features = inputs.input_features.to(self.speech_encoder_device)
+         hidden_states = self.whisper_encoder(input_features=input_features)[
+             "last_hidden_state"
+         ]
+         virt_tokens = self.connector(
+             hidden_states,
+             output_device=self.llama_decoder.model.embed_tokens.weight.device,
+         ).squeeze()
+ 
+         prefix_embed = self.llama_decoder.model.embed_tokens(prefix_text_tokens)
+         suffix_embed = self.llama_decoder.model.embed_tokens(suffix_text_tokens)
+         inputs_embeds = torch.cat(
+             [prefix_embed, virt_tokens, suffix_embed], dim=0
+         ).unsqueeze(0)
+ 
+         # Single full forward pass; no KV cache is threaded through here.
+         outputs = self.llama_decoder(
+             inputs_embeds=inputs_embeds.to(
+                 self.llama_decoder.model.embed_tokens.weight.device
+             ).half(),
+             return_dict=True,
+             output_hidden_states=True,
+         )
+ 
+         return outputs
+ 
+     def generate(
+         self,
+         audio,
+         text_prompt,
+         do_sample=False,
+         logits_processor=None,
+         max_new_tokens=128,
+         temperature=1.0,  # used only when do_sample=True
+     ):
+         inputs = self.processor(audio, return_tensors="pt", sampling_rate=16_000)
+         input_features = inputs.input_features.to(self.speech_encoder_device)
+         hidden_states = self.whisper_encoder(input_features=input_features)[
+             "last_hidden_state"
+         ]
+         virt_tokens = self.connector(
+             hidden_states,
+             output_device=self.llama_decoder.model.embed_tokens.weight.device,
+         ).squeeze()
+ 
+         if text_prompt:
+             user_prompt_text = torch.tensor(
+                 self.tokenizer(text_prompt, add_special_tokens=False)["input_ids"],
+                 device=self.pre_user_suffix.device,
+             )
+             prefix = torch.cat(
+                 [self.pre_user_suffix, user_prompt_text, self.prefix], dim=0
+             )
+         else:
+             prefix = self.prefix
+         prefix_embed = self.llama_decoder.model.embed_tokens(prefix)
+         suffix = self.final_header
+         suffix_embed = self.llama_decoder.model.embed_tokens(suffix)
+         inputs_embeds = torch.cat(
+             [prefix_embed, virt_tokens, suffix_embed], dim=0
+         ).unsqueeze(0)
+         outs = []
+         outputs = None
+         greedy = 1  # dummy init so the loop runs at least once
+         while greedy != 128009 and len(outs) < max_new_tokens:  # 128009 == <|eot_id|>
+             past_key_values = outputs.past_key_values if outputs else None
+             outputs = self.llama_decoder(
+                 inputs_embeds=inputs_embeds.to(
+                     self.llama_decoder.model.embed_tokens.weight.device
+                 ).half(),
+                 return_dict=True,
+                 output_hidden_states=True,
+                 past_key_values=past_key_values,
+             )
+             next_token_logits = outputs.logits[-1, -1, :]
+ 
+             if logits_processor:
+                 local_outs = torch.tensor(outs) if outs else suffix
+                 local_outs = local_outs.reshape(1, -1)
+                 next_token_logits = logits_processor(
+                     local_outs,
+                     next_token_logits.reshape(1, -1),
+                 )
+                 next_token_logits = next_token_logits.flatten()
+             if do_sample:
+                 logits = next_token_logits / temperature
+                 probs = F.softmax(logits, dim=-1)
+                 greedy = torch.multinomial(probs, num_samples=1)[0]
+             else:
+                 greedy = next_token_logits.argmax()
+             outs.append(greedy)
+             # Feed only the newly chosen token back in; the KV cache carries the rest.
+             next_embed = self.llama_decoder.model.embed_tokens(greedy.reshape(1, 1))
+             inputs_embeds = next_embed
+         return self.tokenizer.decode(outs, skip_special_tokens=True).replace(
+             "<|eot_id|>", ""
+         )
+ 
+     def generate_stream(
+         self,
+         audio,
+         text_prompt,
+         do_sample=False,
+         logits_processor=None,
+         max_new_tokens=128,
+         temperature=1.0,  # used only when do_sample=True
+     ):
+         inputs = self.processor(audio, return_tensors="pt", sampling_rate=16_000)
+         input_features = inputs.input_features.to(self.speech_encoder_device)
+         hidden_states = self.whisper_encoder(input_features=input_features)[
+             "last_hidden_state"
+         ]
+         virt_tokens = self.connector(
+             hidden_states,
+             output_device=self.llama_decoder.model.embed_tokens.weight.device,
+         ).squeeze()
+ 
+         if text_prompt:
+             user_prompt_text = torch.tensor(
+                 self.tokenizer(text_prompt, add_special_tokens=False)["input_ids"],
+                 device=self.pre_user_suffix.device,
+             )
+             prefix = torch.cat(
+                 [self.pre_user_suffix, user_prompt_text, self.prefix], dim=0
+             )
+         else:
+             prefix = self.prefix
+         prefix_embed = self.llama_decoder.model.embed_tokens(prefix)
+         suffix = self.final_header
+         suffix_embed = self.llama_decoder.model.embed_tokens(suffix)
+         inputs_embeds = torch.cat(
+             [prefix_embed, virt_tokens, suffix_embed], dim=0
+         ).unsqueeze(0)
+         outs = []
+         outputs = None
+         greedy = 1  # dummy init so the loop runs at least once
+         while greedy != 128009 and len(outs) < max_new_tokens:  # 128009 == <|eot_id|>
+             past_key_values = outputs.past_key_values if outputs else None
+             outputs = self.llama_decoder(
+                 inputs_embeds=inputs_embeds.to(
+                     self.llama_decoder.model.embed_tokens.weight.device
+                 ).half(),
+                 return_dict=True,
+                 output_hidden_states=True,
+                 past_key_values=past_key_values,
+             )
+             next_token_logits = outputs.logits[-1, -1, :]
+ 
+             if logits_processor:
+                 local_outs = torch.tensor(outs) if outs else suffix
+                 local_outs = local_outs.reshape(1, -1)
+                 next_token_logits = logits_processor(
+                     local_outs,
+                     next_token_logits.reshape(1, -1),
+                 )
+                 next_token_logits = next_token_logits.flatten()
+             if do_sample:
+                 logits = next_token_logits / temperature
+                 probs = F.softmax(logits, dim=-1)
+                 greedy = torch.multinomial(probs, num_samples=1)[0]
+             else:
+                 greedy = next_token_logits.argmax()
+             outs.append(greedy)
+             next_embed = self.llama_decoder.model.embed_tokens(greedy.reshape(1, 1))
+             inputs_embeds = next_embed
+             # Stream the full decoded prefix after every step.
+             yield self.tokenizer.decode(outs, skip_special_tokens=True).replace(
+                 "<|eot_id|>", ""
+             )
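A minimal usage sketch for the class above (the repo id and audio file below are placeholders, not taken from the commit; generation is greedy unless do_sample=True is passed):

# Hypothetical usage; assumes a Hub repo laid out like this commit.
import librosa

from modeling_diva import DiVAModel

model = DiVAModel.from_pretrained("<user>/<diva-repo>")  # placeholder repo id
speech, _ = librosa.load("question.wav", sr=16_000)  # resample to match the Whisper processor
print(model.generate(speech, text_prompt="Answer the question you hear."))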
preprocessor_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+   "chunk_length": 30,
+   "feature_extractor_type": "WhisperFeatureExtractor",
+   "feature_size": 128,
+   "hop_length": 160,
+   "n_fft": 400,
+   "n_samples": 480000,
+   "nb_max_frames": 3000,
+   "padding_side": "right",
+   "padding_value": 0.0,
+   "processor_class": "WhisperProcessor",
+   "return_attention_mask": false,
+   "sampling_rate": 16000
+ }
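The numbers above are internally consistent: 30-second chunks at 16 kHz give 480,000 samples, and a 160-sample hop over that window yields 3,000 frames of 128 mel features each. A quick check:

# Frame arithmetic implied by preprocessor_config.json.
chunk_length = 30  # seconds
sampling_rate = 16_000  # Hz
hop_length = 160  # samples between mel frames

assert chunk_length * sampling_rate == 480_000  # n_samples
assert 480_000 // hop_length == 3_000  # nb_max_frames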
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
+ {
+   "bos_token": {
+     "content": "<|begin_of_text|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "<|eot_id|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "<|reserved_special_token_0|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
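The <|eot_id|> declared as eos_token here corresponds to the hard-coded stop id 128009 in the generation loops of modeling_diva.py (the mapping is visible in the added_tokens_decoder below). A sketch of the correspondence, using the tokenizer repo referenced in the code:

# Sanity check: the eos token above maps to the stop id used in generate().
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("WillHeld/via-llama")
assert tok.convert_tokens_to_ids("<|eot_id|>") == 128009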
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
tokenizer_config.json ADDED
@@ -0,0 +1,2064 @@
+ {
+   "added_tokens_decoder": {
+     "128000": {
+       "content": "<|begin_of_text|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128001": {
+       "content": "<|end_of_text|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128002": {
+       "content": "<|reserved_special_token_0|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128003": {
+       "content": "<|reserved_special_token_1|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128004": {
+       "content": "<|reserved_special_token_2|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128005": {
+       "content": "<|reserved_special_token_3|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128006": {
+       "content": "<|start_header_id|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128007": {
+       "content": "<|end_header_id|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128008": {
+       "content": "<|reserved_special_token_4|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128009": {
+       "content": "<|eot_id|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128010": {
+       "content": "<|reserved_special_token_5|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128011": {
+       "content": "<|reserved_special_token_6|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128012": {
+       "content": "<|reserved_special_token_7|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128013": {
+       "content": "<|reserved_special_token_8|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128014": {
+       "content": "<|reserved_special_token_9|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128015": {
+       "content": "<|reserved_special_token_10|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128016": {
+       "content": "<|reserved_special_token_11|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128017": {
+       "content": "<|reserved_special_token_12|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128018": {
+       "content": "<|reserved_special_token_13|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128019": {
+       "content": "<|reserved_special_token_14|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128020": {
+       "content": "<|reserved_special_token_15|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128021": {
+       "content": "<|reserved_special_token_16|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128022": {
+       "content": "<|reserved_special_token_17|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128023": {
+       "content": "<|reserved_special_token_18|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128024": {
+       "content": "<|reserved_special_token_19|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128025": {
+       "content": "<|reserved_special_token_20|>",
+       "lstrip": false,
206
+ "normalized": false,
207
+ "rstrip": false,
208
+ "single_word": false,
209
+ "special": true
210
+ },
211
+ "128026": {
212
+ "content": "<|reserved_special_token_21|>",
213
+ "lstrip": false,
214
+ "normalized": false,
215
+ "rstrip": false,
216
+ "single_word": false,
217
+ "special": true
218
+ },
219
+ "128027": {
220
+ "content": "<|reserved_special_token_22|>",
221
+ "lstrip": false,
222
+ "normalized": false,
223
+ "rstrip": false,
224
+ "single_word": false,
225
+ "special": true
226
+ },
227
+ "128028": {
228
+ "content": "<|reserved_special_token_23|>",
229
+ "lstrip": false,
230
+ "normalized": false,
231
+ "rstrip": false,
232
+ "single_word": false,
233
+ "special": true
234
+ },
235
+ "128029": {
236
+ "content": "<|reserved_special_token_24|>",
237
+ "lstrip": false,
238
+ "normalized": false,
239
+ "rstrip": false,
240
+ "single_word": false,
241
+ "special": true
242
+ },
243
+ "128030": {
244
+ "content": "<|reserved_special_token_25|>",
245
+ "lstrip": false,
246
+ "normalized": false,
247
+ "rstrip": false,
248
+ "single_word": false,
249
+ "special": true
250
+ },
251
+ "128031": {
252
+ "content": "<|reserved_special_token_26|>",
253
+ "lstrip": false,
254
+ "normalized": false,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": true
258
+ },
259
+ "128032": {
260
+ "content": "<|reserved_special_token_27|>",
261
+ "lstrip": false,
262
+ "normalized": false,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": true
266
+ },
267
+ "128033": {
268
+ "content": "<|reserved_special_token_28|>",
269
+ "lstrip": false,
270
+ "normalized": false,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": true
274
+ },
275
+ "128034": {
276
+ "content": "<|reserved_special_token_29|>",
277
+ "lstrip": false,
278
+ "normalized": false,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": true
282
+ },
283
+ "128035": {
284
+ "content": "<|reserved_special_token_30|>",
285
+ "lstrip": false,
286
+ "normalized": false,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": true
290
+ },
291
+ "128036": {
292
+ "content": "<|reserved_special_token_31|>",
293
+ "lstrip": false,
294
+ "normalized": false,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": true
298
+ },
299
+ "128037": {
300
+ "content": "<|reserved_special_token_32|>",
301
+ "lstrip": false,
302
+ "normalized": false,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": true
306
+ },
307
+ "128038": {
308
+ "content": "<|reserved_special_token_33|>",
309
+ "lstrip": false,
310
+ "normalized": false,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": true
314
+ },
315
+ "128039": {
316
+ "content": "<|reserved_special_token_34|>",
317
+ "lstrip": false,
318
+ "normalized": false,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": true
322
+ },
323
+ "128040": {
324
+ "content": "<|reserved_special_token_35|>",
325
+ "lstrip": false,
326
+ "normalized": false,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": true
330
+ },
331
+ "128041": {
332
+ "content": "<|reserved_special_token_36|>",
333
+ "lstrip": false,
334
+ "normalized": false,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": true
338
+ },
339
+ "128042": {
340
+ "content": "<|reserved_special_token_37|>",
341
+ "lstrip": false,
342
+ "normalized": false,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": true
346
+ },
347
+ "128043": {
348
+ "content": "<|reserved_special_token_38|>",
349
+ "lstrip": false,
350
+ "normalized": false,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": true
354
+ },
355
+ "128044": {
356
+ "content": "<|reserved_special_token_39|>",
357
+ "lstrip": false,
358
+ "normalized": false,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": true
362
+ },
363
+ "128045": {
364
+ "content": "<|reserved_special_token_40|>",
365
+ "lstrip": false,
366
+ "normalized": false,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": true
370
+ },
371
+ "128046": {
372
+ "content": "<|reserved_special_token_41|>",
373
+ "lstrip": false,
374
+ "normalized": false,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": true
378
+ },
379
+ "128047": {
380
+ "content": "<|reserved_special_token_42|>",
381
+ "lstrip": false,
382
+ "normalized": false,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": true
386
+ },
387
+ "128048": {
388
+ "content": "<|reserved_special_token_43|>",
389
+ "lstrip": false,
390
+ "normalized": false,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": true
394
+ },
395
+ "128049": {
396
+ "content": "<|reserved_special_token_44|>",
397
+ "lstrip": false,
398
+ "normalized": false,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": true
402
+ },
403
+ "128050": {
404
+ "content": "<|reserved_special_token_45|>",
405
+ "lstrip": false,
406
+ "normalized": false,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": true
410
+ },
411
+ "128051": {
412
+ "content": "<|reserved_special_token_46|>",
413
+ "lstrip": false,
414
+ "normalized": false,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": true
418
+ },
419
+ "128052": {
420
+ "content": "<|reserved_special_token_47|>",
421
+ "lstrip": false,
422
+ "normalized": false,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": true
426
+ },
427
+ "128053": {
428
+ "content": "<|reserved_special_token_48|>",
429
+ "lstrip": false,
430
+ "normalized": false,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": true
434
+ },
435
+ "128054": {
436
+ "content": "<|reserved_special_token_49|>",
437
+ "lstrip": false,
438
+ "normalized": false,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": true
442
+ },
443
+ "128055": {
444
+ "content": "<|reserved_special_token_50|>",
445
+ "lstrip": false,
446
+ "normalized": false,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": true
450
+ },
451
+ "128056": {
452
+ "content": "<|reserved_special_token_51|>",
453
+ "lstrip": false,
454
+ "normalized": false,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": true
458
+ },
459
+ "128057": {
460
+ "content": "<|reserved_special_token_52|>",
461
+ "lstrip": false,
462
+ "normalized": false,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": true
466
+ },
467
+ "128058": {
468
+ "content": "<|reserved_special_token_53|>",
469
+ "lstrip": false,
470
+ "normalized": false,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": true
474
+ },
475
+ "128059": {
476
+ "content": "<|reserved_special_token_54|>",
477
+ "lstrip": false,
478
+ "normalized": false,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": true
482
+ },
483
+ "128060": {
484
+ "content": "<|reserved_special_token_55|>",
485
+ "lstrip": false,
486
+ "normalized": false,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": true
490
+ },
491
+ "128061": {
492
+ "content": "<|reserved_special_token_56|>",
493
+ "lstrip": false,
494
+ "normalized": false,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": true
498
+ },
499
+ "128062": {
500
+ "content": "<|reserved_special_token_57|>",
501
+ "lstrip": false,
502
+ "normalized": false,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": true
506
+ },
507
+ "128063": {
508
+ "content": "<|reserved_special_token_58|>",
509
+ "lstrip": false,
510
+ "normalized": false,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": true
514
+ },
515
+ "128064": {
516
+ "content": "<|reserved_special_token_59|>",
517
+ "lstrip": false,
518
+ "normalized": false,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": true
522
+ },
523
+ "128065": {
524
+ "content": "<|reserved_special_token_60|>",
525
+ "lstrip": false,
526
+ "normalized": false,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": true
530
+ },
531
+ "128066": {
532
+ "content": "<|reserved_special_token_61|>",
533
+ "lstrip": false,
534
+ "normalized": false,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": true
538
+ },
539
+ "128067": {
540
+ "content": "<|reserved_special_token_62|>",
541
+ "lstrip": false,
542
+ "normalized": false,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": true
546
+ },
547
+ "128068": {
548
+ "content": "<|reserved_special_token_63|>",
549
+ "lstrip": false,
550
+ "normalized": false,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": true
554
+ },
555
+ "128069": {
556
+ "content": "<|reserved_special_token_64|>",
557
+ "lstrip": false,
558
+ "normalized": false,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": true
562
+ },
563
+ "128070": {
564
+ "content": "<|reserved_special_token_65|>",
565
+ "lstrip": false,
566
+ "normalized": false,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": true
570
+ },
571
+ "128071": {
572
+ "content": "<|reserved_special_token_66|>",
573
+ "lstrip": false,
574
+ "normalized": false,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": true
578
+ },
579
+ "128072": {
580
+ "content": "<|reserved_special_token_67|>",
581
+ "lstrip": false,
582
+ "normalized": false,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": true
586
+ },
587
+ "128073": {
588
+ "content": "<|reserved_special_token_68|>",
589
+ "lstrip": false,
590
+ "normalized": false,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": true
594
+ },
595
+ "128074": {
596
+ "content": "<|reserved_special_token_69|>",
597
+ "lstrip": false,
598
+ "normalized": false,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": true
602
+ },
603
+ "128075": {
604
+ "content": "<|reserved_special_token_70|>",
605
+ "lstrip": false,
606
+ "normalized": false,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": true
610
+ },
611
+ "128076": {
612
+ "content": "<|reserved_special_token_71|>",
613
+ "lstrip": false,
614
+ "normalized": false,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": true
618
+ },
619
+ "128077": {
620
+ "content": "<|reserved_special_token_72|>",
621
+ "lstrip": false,
622
+ "normalized": false,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": true
626
+ },
627
+ "128078": {
628
+ "content": "<|reserved_special_token_73|>",
629
+ "lstrip": false,
630
+ "normalized": false,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": true
634
+ },
635
+ "128079": {
636
+ "content": "<|reserved_special_token_74|>",
637
+ "lstrip": false,
638
+ "normalized": false,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": true
642
+ },
643
+ "128080": {
644
+ "content": "<|reserved_special_token_75|>",
645
+ "lstrip": false,
646
+ "normalized": false,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": true
650
+ },
651
+ "128081": {
652
+ "content": "<|reserved_special_token_76|>",
653
+ "lstrip": false,
654
+ "normalized": false,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": true
658
+ },
659
+ "128082": {
660
+ "content": "<|reserved_special_token_77|>",
661
+ "lstrip": false,
662
+ "normalized": false,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": true
666
+ },
667
+ "128083": {
668
+ "content": "<|reserved_special_token_78|>",
669
+ "lstrip": false,
670
+ "normalized": false,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": true
674
+ },
675
+ "128084": {
676
+ "content": "<|reserved_special_token_79|>",
677
+ "lstrip": false,
678
+ "normalized": false,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": true
682
+ },
683
+ "128085": {
684
+ "content": "<|reserved_special_token_80|>",
685
+ "lstrip": false,
686
+ "normalized": false,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": true
690
+ },
691
+ "128086": {
692
+ "content": "<|reserved_special_token_81|>",
693
+ "lstrip": false,
694
+ "normalized": false,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": true
698
+ },
699
+ "128087": {
700
+ "content": "<|reserved_special_token_82|>",
701
+ "lstrip": false,
702
+ "normalized": false,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": true
706
+ },
707
+ "128088": {
708
+ "content": "<|reserved_special_token_83|>",
709
+ "lstrip": false,
710
+ "normalized": false,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": true
714
+ },
715
+ "128089": {
716
+ "content": "<|reserved_special_token_84|>",
717
+ "lstrip": false,
718
+ "normalized": false,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": true
722
+ },
723
+ "128090": {
724
+ "content": "<|reserved_special_token_85|>",
725
+ "lstrip": false,
726
+ "normalized": false,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": true
730
+ },
731
+ "128091": {
732
+ "content": "<|reserved_special_token_86|>",
733
+ "lstrip": false,
734
+ "normalized": false,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": true
738
+ },
739
+ "128092": {
740
+ "content": "<|reserved_special_token_87|>",
741
+ "lstrip": false,
742
+ "normalized": false,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": true
746
+ },
747
+ "128093": {
748
+ "content": "<|reserved_special_token_88|>",
749
+ "lstrip": false,
750
+ "normalized": false,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": true
754
+ },
755
+ "128094": {
756
+ "content": "<|reserved_special_token_89|>",
757
+ "lstrip": false,
758
+ "normalized": false,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": true
762
+ },
763
+ "128095": {
764
+ "content": "<|reserved_special_token_90|>",
765
+ "lstrip": false,
766
+ "normalized": false,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": true
770
+ },
771
+ "128096": {
772
+ "content": "<|reserved_special_token_91|>",
773
+ "lstrip": false,
774
+ "normalized": false,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": true
778
+ },
779
+ "128097": {
780
+ "content": "<|reserved_special_token_92|>",
781
+ "lstrip": false,
782
+ "normalized": false,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": true
786
+ },
787
+ "128098": {
788
+ "content": "<|reserved_special_token_93|>",
789
+ "lstrip": false,
790
+ "normalized": false,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": true
794
+ },
795
+ "128099": {
796
+ "content": "<|reserved_special_token_94|>",
797
+ "lstrip": false,
798
+ "normalized": false,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": true
802
+ },
803
+ "128100": {
804
+ "content": "<|reserved_special_token_95|>",
805
+ "lstrip": false,
806
+ "normalized": false,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": true
810
+ },
811
+ "128101": {
812
+ "content": "<|reserved_special_token_96|>",
813
+ "lstrip": false,
814
+ "normalized": false,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": true
818
+ },
819
+ "128102": {
820
+ "content": "<|reserved_special_token_97|>",
821
+ "lstrip": false,
822
+ "normalized": false,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": true
826
+ },
827
+ "128103": {
828
+ "content": "<|reserved_special_token_98|>",
829
+ "lstrip": false,
830
+ "normalized": false,
831
+ "rstrip": false,
832
+ "single_word": false,
833
+ "special": true
834
+ },
835
+ "128104": {
836
+ "content": "<|reserved_special_token_99|>",
837
+ "lstrip": false,
838
+ "normalized": false,
839
+ "rstrip": false,
840
+ "single_word": false,
841
+ "special": true
842
+ },
843
+ "128105": {
844
+ "content": "<|reserved_special_token_100|>",
845
+ "lstrip": false,
846
+ "normalized": false,
847
+ "rstrip": false,
848
+ "single_word": false,
849
+ "special": true
850
+ },
851
+ "128106": {
852
+ "content": "<|reserved_special_token_101|>",
853
+ "lstrip": false,
854
+ "normalized": false,
855
+ "rstrip": false,
856
+ "single_word": false,
857
+ "special": true
858
+ },
859
+ "128107": {
860
+ "content": "<|reserved_special_token_102|>",
861
+ "lstrip": false,
862
+ "normalized": false,
863
+ "rstrip": false,
864
+ "single_word": false,
865
+ "special": true
866
+ },
867
+ "128108": {
868
+ "content": "<|reserved_special_token_103|>",
869
+ "lstrip": false,
870
+ "normalized": false,
871
+ "rstrip": false,
872
+ "single_word": false,
873
+ "special": true
874
+ },
875
+ "128109": {
876
+ "content": "<|reserved_special_token_104|>",
877
+ "lstrip": false,
878
+ "normalized": false,
879
+ "rstrip": false,
880
+ "single_word": false,
881
+ "special": true
882
+ },
883
+ "128110": {
884
+ "content": "<|reserved_special_token_105|>",
885
+ "lstrip": false,
886
+ "normalized": false,
887
+ "rstrip": false,
888
+ "single_word": false,
889
+ "special": true
890
+ },
891
+ "128111": {
892
+ "content": "<|reserved_special_token_106|>",
893
+ "lstrip": false,
894
+ "normalized": false,
895
+ "rstrip": false,
896
+ "single_word": false,
897
+ "special": true
898
+ },
899
+ "128112": {
900
+ "content": "<|reserved_special_token_107|>",
901
+ "lstrip": false,
902
+ "normalized": false,
903
+ "rstrip": false,
904
+ "single_word": false,
905
+ "special": true
906
+ },
907
+ "128113": {
908
+ "content": "<|reserved_special_token_108|>",
909
+ "lstrip": false,
910
+ "normalized": false,
911
+ "rstrip": false,
912
+ "single_word": false,
913
+ "special": true
914
+ },
915
+ "128114": {
916
+ "content": "<|reserved_special_token_109|>",
917
+ "lstrip": false,
918
+ "normalized": false,
919
+ "rstrip": false,
920
+ "single_word": false,
921
+ "special": true
922
+ },
923
+ "128115": {
924
+ "content": "<|reserved_special_token_110|>",
925
+ "lstrip": false,
926
+ "normalized": false,
927
+ "rstrip": false,
928
+ "single_word": false,
929
+ "special": true
930
+ },
931
+ "128116": {
932
+ "content": "<|reserved_special_token_111|>",
933
+ "lstrip": false,
934
+ "normalized": false,
935
+ "rstrip": false,
936
+ "single_word": false,
937
+ "special": true
938
+ },
939
+ "128117": {
940
+ "content": "<|reserved_special_token_112|>",
941
+ "lstrip": false,
942
+ "normalized": false,
943
+ "rstrip": false,
944
+ "single_word": false,
945
+ "special": true
946
+ },
947
+ "128118": {
948
+ "content": "<|reserved_special_token_113|>",
949
+ "lstrip": false,
950
+ "normalized": false,
951
+ "rstrip": false,
952
+ "single_word": false,
953
+ "special": true
954
+ },
955
+ "128119": {
956
+ "content": "<|reserved_special_token_114|>",
957
+ "lstrip": false,
958
+ "normalized": false,
959
+ "rstrip": false,
960
+ "single_word": false,
961
+ "special": true
962
+ },
963
+ "128120": {
964
+ "content": "<|reserved_special_token_115|>",
965
+ "lstrip": false,
966
+ "normalized": false,
967
+ "rstrip": false,
968
+ "single_word": false,
969
+ "special": true
970
+ },
971
+ "128121": {
972
+ "content": "<|reserved_special_token_116|>",
973
+ "lstrip": false,
974
+ "normalized": false,
975
+ "rstrip": false,
976
+ "single_word": false,
977
+ "special": true
978
+ },
979
+ "128122": {
980
+ "content": "<|reserved_special_token_117|>",
981
+ "lstrip": false,
982
+ "normalized": false,
983
+ "rstrip": false,
984
+ "single_word": false,
985
+ "special": true
986
+ },
987
+ "128123": {
988
+ "content": "<|reserved_special_token_118|>",
989
+ "lstrip": false,
990
+ "normalized": false,
991
+ "rstrip": false,
992
+ "single_word": false,
993
+ "special": true
994
+ },
995
+ "128124": {
996
+ "content": "<|reserved_special_token_119|>",
997
+ "lstrip": false,
998
+ "normalized": false,
999
+ "rstrip": false,
1000
+ "single_word": false,
1001
+ "special": true
1002
+ },
1003
+ "128125": {
1004
+ "content": "<|reserved_special_token_120|>",
1005
+ "lstrip": false,
1006
+ "normalized": false,
1007
+ "rstrip": false,
1008
+ "single_word": false,
1009
+ "special": true
1010
+ },
1011
+ "128126": {
1012
+ "content": "<|reserved_special_token_121|>",
1013
+ "lstrip": false,
1014
+ "normalized": false,
1015
+ "rstrip": false,
1016
+ "single_word": false,
1017
+ "special": true
1018
+ },
1019
+ "128127": {
1020
+ "content": "<|reserved_special_token_122|>",
1021
+ "lstrip": false,
1022
+ "normalized": false,
1023
+ "rstrip": false,
1024
+ "single_word": false,
1025
+ "special": true
1026
+ },
1027
+ "128128": {
1028
+ "content": "<|reserved_special_token_123|>",
1029
+ "lstrip": false,
1030
+ "normalized": false,
1031
+ "rstrip": false,
1032
+ "single_word": false,
1033
+ "special": true
1034
+ },
1035
+ "128129": {
1036
+ "content": "<|reserved_special_token_124|>",
1037
+ "lstrip": false,
1038
+ "normalized": false,
1039
+ "rstrip": false,
1040
+ "single_word": false,
1041
+ "special": true
1042
+ },
1043
+ "128130": {
1044
+ "content": "<|reserved_special_token_125|>",
1045
+ "lstrip": false,
1046
+ "normalized": false,
1047
+ "rstrip": false,
1048
+ "single_word": false,
1049
+ "special": true
1050
+ },
1051
+ "128131": {
1052
+ "content": "<|reserved_special_token_126|>",
1053
+ "lstrip": false,
1054
+ "normalized": false,
1055
+ "rstrip": false,
1056
+ "single_word": false,
1057
+ "special": true
1058
+ },
1059
+ "128132": {
1060
+ "content": "<|reserved_special_token_127|>",
1061
+ "lstrip": false,
1062
+ "normalized": false,
1063
+ "rstrip": false,
1064
+ "single_word": false,
1065
+ "special": true
1066
+ },
1067
+ "128133": {
1068
+ "content": "<|reserved_special_token_128|>",
1069
+ "lstrip": false,
1070
+ "normalized": false,
1071
+ "rstrip": false,
1072
+ "single_word": false,
1073
+ "special": true
1074
+ },
1075
+ "128134": {
1076
+ "content": "<|reserved_special_token_129|>",
1077
+ "lstrip": false,
1078
+ "normalized": false,
1079
+ "rstrip": false,
1080
+ "single_word": false,
1081
+ "special": true
1082
+ },
1083
+ "128135": {
1084
+ "content": "<|reserved_special_token_130|>",
1085
+ "lstrip": false,
1086
+ "normalized": false,
1087
+ "rstrip": false,
1088
+ "single_word": false,
1089
+ "special": true
1090
+ },
1091
+ "128136": {
1092
+ "content": "<|reserved_special_token_131|>",
1093
+ "lstrip": false,
1094
+ "normalized": false,
1095
+ "rstrip": false,
1096
+ "single_word": false,
1097
+ "special": true
1098
+ },
1099
+ "128137": {
1100
+ "content": "<|reserved_special_token_132|>",
1101
+ "lstrip": false,
1102
+ "normalized": false,
1103
+ "rstrip": false,
1104
+ "single_word": false,
1105
+ "special": true
1106
+ },
1107
+ "128138": {
1108
+ "content": "<|reserved_special_token_133|>",
1109
+ "lstrip": false,
1110
+ "normalized": false,
1111
+ "rstrip": false,
1112
+ "single_word": false,
1113
+ "special": true
1114
+ },
1115
+ "128139": {
1116
+ "content": "<|reserved_special_token_134|>",
1117
+ "lstrip": false,
1118
+ "normalized": false,
1119
+ "rstrip": false,
1120
+ "single_word": false,
1121
+ "special": true
1122
+ },
1123
+ "128140": {
1124
+ "content": "<|reserved_special_token_135|>",
1125
+ "lstrip": false,
1126
+ "normalized": false,
1127
+ "rstrip": false,
1128
+ "single_word": false,
1129
+ "special": true
1130
+ },
1131
+ "128141": {
1132
+ "content": "<|reserved_special_token_136|>",
1133
+ "lstrip": false,
1134
+ "normalized": false,
1135
+ "rstrip": false,
1136
+ "single_word": false,
1137
+ "special": true
1138
+ },
1139
+ "128142": {
1140
+ "content": "<|reserved_special_token_137|>",
1141
+ "lstrip": false,
1142
+ "normalized": false,
1143
+ "rstrip": false,
1144
+ "single_word": false,
1145
+ "special": true
1146
+ },
1147
+ "128143": {
1148
+ "content": "<|reserved_special_token_138|>",
1149
+ "lstrip": false,
1150
+ "normalized": false,
1151
+ "rstrip": false,
1152
+ "single_word": false,
1153
+ "special": true
1154
+ },
1155
+ "128144": {
1156
+ "content": "<|reserved_special_token_139|>",
1157
+ "lstrip": false,
1158
+ "normalized": false,
1159
+ "rstrip": false,
1160
+ "single_word": false,
1161
+ "special": true
1162
+ },
1163
+ "128145": {
1164
+ "content": "<|reserved_special_token_140|>",
1165
+ "lstrip": false,
1166
+ "normalized": false,
1167
+ "rstrip": false,
1168
+ "single_word": false,
1169
+ "special": true
1170
+ },
1171
+ "128146": {
1172
+ "content": "<|reserved_special_token_141|>",
1173
+ "lstrip": false,
1174
+ "normalized": false,
1175
+ "rstrip": false,
1176
+ "single_word": false,
1177
+ "special": true
1178
+ },
1179
+ "128147": {
1180
+ "content": "<|reserved_special_token_142|>",
1181
+ "lstrip": false,
1182
+ "normalized": false,
1183
+ "rstrip": false,
1184
+ "single_word": false,
1185
+ "special": true
1186
+ },
1187
+ "128148": {
1188
+ "content": "<|reserved_special_token_143|>",
1189
+ "lstrip": false,
1190
+ "normalized": false,
1191
+ "rstrip": false,
1192
+ "single_word": false,
1193
+ "special": true
1194
+ },
1195
+ "128149": {
1196
+ "content": "<|reserved_special_token_144|>",
1197
+ "lstrip": false,
1198
+ "normalized": false,
1199
+ "rstrip": false,
1200
+ "single_word": false,
1201
+ "special": true
1202
+ },
1203
+ "128150": {
1204
+ "content": "<|reserved_special_token_145|>",
1205
+ "lstrip": false,
1206
+ "normalized": false,
1207
+ "rstrip": false,
1208
+ "single_word": false,
1209
+ "special": true
1210
+ },
1211
+ "128151": {
1212
+ "content": "<|reserved_special_token_146|>",
1213
+ "lstrip": false,
1214
+ "normalized": false,
1215
+ "rstrip": false,
1216
+ "single_word": false,
1217
+ "special": true
1218
+ },
1219
+ "128152": {
1220
+ "content": "<|reserved_special_token_147|>",
1221
+ "lstrip": false,
1222
+ "normalized": false,
1223
+ "rstrip": false,
1224
+ "single_word": false,
1225
+ "special": true
1226
+ },
1227
+ "128153": {
1228
+ "content": "<|reserved_special_token_148|>",
1229
+ "lstrip": false,
1230
+ "normalized": false,
1231
+ "rstrip": false,
1232
+ "single_word": false,
1233
+ "special": true
1234
+ },
1235
+ "128154": {
1236
+ "content": "<|reserved_special_token_149|>",
1237
+ "lstrip": false,
1238
+ "normalized": false,
1239
+ "rstrip": false,
1240
+ "single_word": false,
1241
+ "special": true
1242
+ },
1243
+ "128155": {
1244
+ "content": "<|reserved_special_token_150|>",
1245
+ "lstrip": false,
1246
+ "normalized": false,
1247
+ "rstrip": false,
1248
+ "single_word": false,
1249
+ "special": true
1250
+ },
1251
+ "128156": {
1252
+ "content": "<|reserved_special_token_151|>",
1253
+ "lstrip": false,
1254
+ "normalized": false,
1255
+ "rstrip": false,
1256
+ "single_word": false,
1257
+ "special": true
1258
+ },
1259
+ "128157": {
1260
+ "content": "<|reserved_special_token_152|>",
1261
+ "lstrip": false,
1262
+ "normalized": false,
1263
+ "rstrip": false,
1264
+ "single_word": false,
1265
+ "special": true
1266
+ },
1267
+ "128158": {
1268
+ "content": "<|reserved_special_token_153|>",
1269
+ "lstrip": false,
1270
+ "normalized": false,
1271
+ "rstrip": false,
1272
+ "single_word": false,
1273
+ "special": true
1274
+ },
1275
+ "128159": {
1276
+ "content": "<|reserved_special_token_154|>",
1277
+ "lstrip": false,
1278
+ "normalized": false,
1279
+ "rstrip": false,
1280
+ "single_word": false,
1281
+ "special": true
1282
+ },
1283
+ "128160": {
1284
+ "content": "<|reserved_special_token_155|>",
1285
+ "lstrip": false,
1286
+ "normalized": false,
1287
+ "rstrip": false,
1288
+ "single_word": false,
1289
+ "special": true
1290
+ },
1291
+ "128161": {
1292
+ "content": "<|reserved_special_token_156|>",
1293
+ "lstrip": false,
1294
+ "normalized": false,
1295
+ "rstrip": false,
1296
+ "single_word": false,
1297
+ "special": true
1298
+ },
1299
+ "128162": {
1300
+ "content": "<|reserved_special_token_157|>",
1301
+ "lstrip": false,
1302
+ "normalized": false,
1303
+ "rstrip": false,
1304
+ "single_word": false,
1305
+ "special": true
1306
+ },
1307
+ "128163": {
1308
+ "content": "<|reserved_special_token_158|>",
1309
+ "lstrip": false,
1310
+ "normalized": false,
1311
+ "rstrip": false,
1312
+ "single_word": false,
1313
+ "special": true
1314
+ },
1315
+ "128164": {
1316
+ "content": "<|reserved_special_token_159|>",
1317
+ "lstrip": false,
1318
+ "normalized": false,
1319
+ "rstrip": false,
1320
+ "single_word": false,
1321
+ "special": true
1322
+ },
1323
+ "128165": {
1324
+ "content": "<|reserved_special_token_160|>",
1325
+ "lstrip": false,
1326
+ "normalized": false,
1327
+ "rstrip": false,
1328
+ "single_word": false,
1329
+ "special": true
1330
+ },
1331
+ "128166": {
1332
+ "content": "<|reserved_special_token_161|>",
1333
+ "lstrip": false,
1334
+ "normalized": false,
1335
+ "rstrip": false,
1336
+ "single_word": false,
1337
+ "special": true
1338
+ },
1339
+ "128167": {
1340
+ "content": "<|reserved_special_token_162|>",
1341
+ "lstrip": false,
1342
+ "normalized": false,
1343
+ "rstrip": false,
1344
+ "single_word": false,
1345
+ "special": true
1346
+ },
1347
+ "128168": {
1348
+ "content": "<|reserved_special_token_163|>",
1349
+ "lstrip": false,
1350
+ "normalized": false,
1351
+ "rstrip": false,
1352
+ "single_word": false,
1353
+ "special": true
1354
+ },
1355
+ "128169": {
1356
+ "content": "<|reserved_special_token_164|>",
1357
+ "lstrip": false,
1358
+ "normalized": false,
1359
+ "rstrip": false,
1360
+ "single_word": false,
1361
+ "special": true
1362
+ },
1363
+ "128170": {
1364
+ "content": "<|reserved_special_token_165|>",
1365
+ "lstrip": false,
1366
+ "normalized": false,
1367
+ "rstrip": false,
1368
+ "single_word": false,
1369
+ "special": true
1370
+ },
1371
+ "128171": {
1372
+ "content": "<|reserved_special_token_166|>",
1373
+ "lstrip": false,
1374
+ "normalized": false,
1375
+ "rstrip": false,
1376
+ "single_word": false,
1377
+ "special": true
1378
+ },
1379
+ "128172": {
1380
+ "content": "<|reserved_special_token_167|>",
1381
+ "lstrip": false,
1382
+ "normalized": false,
1383
+ "rstrip": false,
1384
+ "single_word": false,
1385
+ "special": true
1386
+ },
1387
+ "128173": {
1388
+ "content": "<|reserved_special_token_168|>",
1389
+ "lstrip": false,
1390
+ "normalized": false,
1391
+ "rstrip": false,
1392
+ "single_word": false,
1393
+ "special": true
1394
+ },
1395
+ "128174": {
1396
+ "content": "<|reserved_special_token_169|>",
1397
+ "lstrip": false,
1398
+ "normalized": false,
1399
+ "rstrip": false,
1400
+ "single_word": false,
1401
+ "special": true
1402
+ },
1403
+ "128175": {
1404
+ "content": "<|reserved_special_token_170|>",
1405
+ "lstrip": false,
1406
+ "normalized": false,
1407
+ "rstrip": false,
1408
+ "single_word": false,
1409
+ "special": true
1410
+ },
1411
+ "128176": {
1412
+ "content": "<|reserved_special_token_171|>",
1413
+ "lstrip": false,
1414
+ "normalized": false,
1415
+ "rstrip": false,
1416
+ "single_word": false,
1417
+ "special": true
1418
+ },
1419
+ "128177": {
1420
+ "content": "<|reserved_special_token_172|>",
1421
+ "lstrip": false,
1422
+ "normalized": false,
1423
+ "rstrip": false,
1424
+ "single_word": false,
1425
+ "special": true
1426
+ },
1427
+ "128178": {
1428
+ "content": "<|reserved_special_token_173|>",
1429
+ "lstrip": false,
1430
+ "normalized": false,
1431
+ "rstrip": false,
1432
+ "single_word": false,
1433
+ "special": true
1434
+ },
1435
+ "128179": {
1436
+ "content": "<|reserved_special_token_174|>",
1437
+ "lstrip": false,
1438
+ "normalized": false,
1439
+ "rstrip": false,
1440
+ "single_word": false,
1441
+ "special": true
1442
+ },
1443
+ "128180": {
1444
+ "content": "<|reserved_special_token_175|>",
1445
+ "lstrip": false,
1446
+ "normalized": false,
1447
+ "rstrip": false,
1448
+ "single_word": false,
1449
+ "special": true
1450
+ },
1451
+ "128181": {
1452
+ "content": "<|reserved_special_token_176|>",
1453
+ "lstrip": false,
1454
+ "normalized": false,
1455
+ "rstrip": false,
1456
+ "single_word": false,
1457
+ "special": true
1458
+ },
1459
+ "128182": {
1460
+ "content": "<|reserved_special_token_177|>",
1461
+ "lstrip": false,
1462
+ "normalized": false,
1463
+ "rstrip": false,
1464
+ "single_word": false,
1465
+ "special": true
1466
+ },
1467
+ "128183": {
1468
+ "content": "<|reserved_special_token_178|>",
1469
+ "lstrip": false,
1470
+ "normalized": false,
1471
+ "rstrip": false,
1472
+ "single_word": false,
1473
+ "special": true
1474
+ },
1475
+ "128184": {
1476
+ "content": "<|reserved_special_token_179|>",
1477
+ "lstrip": false,
1478
+ "normalized": false,
1479
+ "rstrip": false,
1480
+ "single_word": false,
1481
+ "special": true
1482
+ },
1483
+ "128185": {
1484
+ "content": "<|reserved_special_token_180|>",
1485
+ "lstrip": false,
1486
+ "normalized": false,
1487
+ "rstrip": false,
1488
+ "single_word": false,
1489
+ "special": true
1490
+ },
1491
+ "128186": {
1492
+ "content": "<|reserved_special_token_181|>",
1493
+ "lstrip": false,
1494
+ "normalized": false,
1495
+ "rstrip": false,
1496
+ "single_word": false,
1497
+ "special": true
1498
+ },
1499
+ "128187": {
1500
+ "content": "<|reserved_special_token_182|>",
1501
+ "lstrip": false,
1502
+ "normalized": false,
1503
+ "rstrip": false,
1504
+ "single_word": false,
1505
+ "special": true
1506
+ },
1507
+ "128188": {
1508
+ "content": "<|reserved_special_token_183|>",
1509
+ "lstrip": false,
1510
+ "normalized": false,
1511
+ "rstrip": false,
1512
+ "single_word": false,
1513
+ "special": true
1514
+ },
1515
+ "128189": {
1516
+ "content": "<|reserved_special_token_184|>",
1517
+ "lstrip": false,
1518
+ "normalized": false,
1519
+ "rstrip": false,
1520
+ "single_word": false,
1521
+ "special": true
1522
+ },
1523
+ "128190": {
1524
+ "content": "<|reserved_special_token_185|>",
1525
+ "lstrip": false,
1526
+ "normalized": false,
1527
+ "rstrip": false,
1528
+ "single_word": false,
1529
+ "special": true
1530
+ },
1531
+ "128191": {
1532
+ "content": "<|reserved_special_token_186|>",
1533
+ "lstrip": false,
1534
+ "normalized": false,
1535
+ "rstrip": false,
1536
+ "single_word": false,
1537
+ "special": true
1538
+ },
1539
+ "128192": {
1540
+ "content": "<|reserved_special_token_187|>",
1541
+ "lstrip": false,
1542
+ "normalized": false,
1543
+ "rstrip": false,
1544
+ "single_word": false,
1545
+ "special": true
1546
+ },
1547
+ "128193": {
1548
+ "content": "<|reserved_special_token_188|>",
1549
+ "lstrip": false,
1550
+ "normalized": false,
1551
+ "rstrip": false,
1552
+ "single_word": false,
1553
+ "special": true
1554
+ },
1555
+ "128194": {
1556
+ "content": "<|reserved_special_token_189|>",
1557
+ "lstrip": false,
1558
+ "normalized": false,
1559
+ "rstrip": false,
1560
+ "single_word": false,
1561
+ "special": true
1562
+ },
1563
+ "128195": {
1564
+ "content": "<|reserved_special_token_190|>",
1565
+ "lstrip": false,
1566
+ "normalized": false,
1567
+ "rstrip": false,
1568
+ "single_word": false,
1569
+ "special": true
1570
+ },
1571
+ "128196": {
1572
+ "content": "<|reserved_special_token_191|>",
1573
+ "lstrip": false,
1574
+ "normalized": false,
1575
+ "rstrip": false,
1576
+ "single_word": false,
1577
+ "special": true
1578
+ },
1579
+ "128197": {
1580
+ "content": "<|reserved_special_token_192|>",
1581
+ "lstrip": false,
1582
+ "normalized": false,
1583
+ "rstrip": false,
1584
+ "single_word": false,
1585
+ "special": true
1586
+ },
1587
+ "128198": {
1588
+ "content": "<|reserved_special_token_193|>",
1589
+ "lstrip": false,
1590
+ "normalized": false,
1591
+ "rstrip": false,
1592
+ "single_word": false,
1593
+ "special": true
1594
+ },
1595
+ "128199": {
1596
+ "content": "<|reserved_special_token_194|>",
1597
+ "lstrip": false,
1598
+ "normalized": false,
1599
+ "rstrip": false,
1600
+ "single_word": false,
1601
+ "special": true
1602
+ },
1603
+ "128200": {
1604
+ "content": "<|reserved_special_token_195|>",
1605
+ "lstrip": false,
1606
+ "normalized": false,
1607
+ "rstrip": false,
1608
+ "single_word": false,
1609
+ "special": true
1610
+ },
1611
+ "128201": {
1612
+ "content": "<|reserved_special_token_196|>",
1613
+ "lstrip": false,
1614
+ "normalized": false,
1615
+ "rstrip": false,
1616
+ "single_word": false,
1617
+ "special": true
1618
+ },
1619
+ "128202": {
1620
+ "content": "<|reserved_special_token_197|>",
1621
+ "lstrip": false,
1622
+ "normalized": false,
1623
+ "rstrip": false,
1624
+ "single_word": false,
1625
+ "special": true
1626
+ },
1627
+ "128203": {
1628
+ "content": "<|reserved_special_token_198|>",
1629
+ "lstrip": false,
1630
+ "normalized": false,
1631
+ "rstrip": false,
1632
+ "single_word": false,
1633
+ "special": true
1634
+ },
1635
+ "128204": {
1636
+ "content": "<|reserved_special_token_199|>",
1637
+ "lstrip": false,
1638
+ "normalized": false,
1639
+ "rstrip": false,
1640
+ "single_word": false,
1641
+ "special": true
1642
+ },
1643
+ "128205": {
1644
+ "content": "<|reserved_special_token_200|>",
1645
+ "lstrip": false,
1646
+ "normalized": false,
1647
+ "rstrip": false,
1648
+ "single_word": false,
1649
+ "special": true
1650
+ },
1651
+ "128206": {
1652
+ "content": "<|reserved_special_token_201|>",
1653
+ "lstrip": false,
1654
+ "normalized": false,
1655
+ "rstrip": false,
1656
+ "single_word": false,
1657
+ "special": true
1658
+ },
1659
+ "128207": {
1660
+ "content": "<|reserved_special_token_202|>",
1661
+ "lstrip": false,
1662
+ "normalized": false,
1663
+ "rstrip": false,
1664
+ "single_word": false,
1665
+ "special": true
1666
+ },
1667
+ "128208": {
1668
+ "content": "<|reserved_special_token_203|>",
1669
+ "lstrip": false,
1670
+ "normalized": false,
1671
+ "rstrip": false,
1672
+ "single_word": false,
1673
+ "special": true
1674
+ },
1675
+ "128209": {
1676
+ "content": "<|reserved_special_token_204|>",
1677
+ "lstrip": false,
1678
+ "normalized": false,
1679
+ "rstrip": false,
1680
+ "single_word": false,
1681
+ "special": true
1682
+ },
1683
+ "128210": {
1684
+ "content": "<|reserved_special_token_205|>",
1685
+ "lstrip": false,
1686
+ "normalized": false,
1687
+ "rstrip": false,
1688
+ "single_word": false,
1689
+ "special": true
1690
+ },
1691
+ "128211": {
1692
+ "content": "<|reserved_special_token_206|>",
1693
+ "lstrip": false,
1694
+ "normalized": false,
1695
+ "rstrip": false,
1696
+ "single_word": false,
1697
+ "special": true
1698
+ },
1699
+ "128212": {
1700
+ "content": "<|reserved_special_token_207|>",
1701
+ "lstrip": false,
1702
+ "normalized": false,
1703
+ "rstrip": false,
1704
+ "single_word": false,
1705
+ "special": true
1706
+ },
1707
+ "128213": {
1708
+ "content": "<|reserved_special_token_208|>",
1709
+ "lstrip": false,
1710
+ "normalized": false,
1711
+ "rstrip": false,
1712
+ "single_word": false,
1713
+ "special": true
1714
+ },
1715
+ "128214": {
1716
+ "content": "<|reserved_special_token_209|>",
1717
+ "lstrip": false,
1718
+ "normalized": false,
1719
+ "rstrip": false,
1720
+ "single_word": false,
1721
+ "special": true
1722
+ },
1723
+ "128215": {
1724
+ "content": "<|reserved_special_token_210|>",
1725
+ "lstrip": false,
1726
+ "normalized": false,
1727
+ "rstrip": false,
1728
+ "single_word": false,
1729
+ "special": true
1730
+ },
1731
+ "128216": {
1732
+ "content": "<|reserved_special_token_211|>",
1733
+ "lstrip": false,
1734
+ "normalized": false,
1735
+ "rstrip": false,
1736
+ "single_word": false,
1737
+ "special": true
1738
+ },
1739
+ "128217": {
1740
+ "content": "<|reserved_special_token_212|>",
1741
+ "lstrip": false,
1742
+ "normalized": false,
1743
+ "rstrip": false,
1744
+ "single_word": false,
1745
+ "special": true
1746
+ },
1747
+ "128218": {
1748
+ "content": "<|reserved_special_token_213|>",
1749
+ "lstrip": false,
1750
+ "normalized": false,
1751
+ "rstrip": false,
1752
+ "single_word": false,
1753
+ "special": true
1754
+ },
1755
+ "128219": {
1756
+ "content": "<|reserved_special_token_214|>",
1757
+ "lstrip": false,
1758
+ "normalized": false,
1759
+ "rstrip": false,
1760
+ "single_word": false,
1761
+ "special": true
1762
+ },
1763
+ "128220": {
1764
+ "content": "<|reserved_special_token_215|>",
1765
+ "lstrip": false,
1766
+ "normalized": false,
1767
+ "rstrip": false,
1768
+ "single_word": false,
1769
+ "special": true
1770
+ },
1771
+ "128221": {
1772
+ "content": "<|reserved_special_token_216|>",
1773
+ "lstrip": false,
1774
+ "normalized": false,
1775
+ "rstrip": false,
1776
+ "single_word": false,
1777
+ "special": true
1778
+ },
1779
+ "128222": {
1780
+ "content": "<|reserved_special_token_217|>",
1781
+ "lstrip": false,
1782
+ "normalized": false,
1783
+ "rstrip": false,
1784
+ "single_word": false,
1785
+ "special": true
1786
+ },
1787
+ "128223": {
1788
+ "content": "<|reserved_special_token_218|>",
1789
+ "lstrip": false,
1790
+ "normalized": false,
1791
+ "rstrip": false,
1792
+ "single_word": false,
1793
+ "special": true
1794
+ },
1795
+ "128224": {
1796
+ "content": "<|reserved_special_token_219|>",
1797
+ "lstrip": false,
1798
+ "normalized": false,
1799
+ "rstrip": false,
1800
+ "single_word": false,
1801
+ "special": true
1802
+ },
1803
+ "128225": {
1804
+ "content": "<|reserved_special_token_220|>",
1805
+ "lstrip": false,
1806
+ "normalized": false,
1807
+ "rstrip": false,
1808
+ "single_word": false,
1809
+ "special": true
1810
+ },
1811
+ "128226": {
1812
+ "content": "<|reserved_special_token_221|>",
1813
+ "lstrip": false,
1814
+ "normalized": false,
1815
+ "rstrip": false,
1816
+ "single_word": false,
1817
+ "special": true
1818
+ },
1819
+ "128227": {
1820
+ "content": "<|reserved_special_token_222|>",
1821
+ "lstrip": false,
1822
+ "normalized": false,
1823
+ "rstrip": false,
1824
+ "single_word": false,
1825
+ "special": true
1826
+ },
1827
+ "128228": {
1828
+ "content": "<|reserved_special_token_223|>",
1829
+ "lstrip": false,
1830
+ "normalized": false,
1831
+ "rstrip": false,
1832
+ "single_word": false,
1833
+ "special": true
1834
+ },
1835
+ "128229": {
1836
+ "content": "<|reserved_special_token_224|>",
1837
+ "lstrip": false,
1838
+ "normalized": false,
1839
+ "rstrip": false,
1840
+ "single_word": false,
1841
+ "special": true
1842
+ },
1843
+ "128230": {
1844
+ "content": "<|reserved_special_token_225|>",
1845
+ "lstrip": false,
1846
+ "normalized": false,
1847
+ "rstrip": false,
1848
+ "single_word": false,
1849
+ "special": true
1850
+ },
1851
+ "128231": {
1852
+ "content": "<|reserved_special_token_226|>",
1853
+ "lstrip": false,
1854
+ "normalized": false,
1855
+ "rstrip": false,
1856
+ "single_word": false,
1857
+ "special": true
1858
+ },
1859
+ "128232": {
1860
+ "content": "<|reserved_special_token_227|>",
1861
+ "lstrip": false,
1862
+ "normalized": false,
1863
+ "rstrip": false,
1864
+ "single_word": false,
1865
+ "special": true
1866
+ },
1867
+ "128233": {
1868
+ "content": "<|reserved_special_token_228|>",
1869
+ "lstrip": false,
1870
+ "normalized": false,
1871
+ "rstrip": false,
1872
+ "single_word": false,
1873
+ "special": true
1874
+ },
1875
+ "128234": {
1876
+ "content": "<|reserved_special_token_229|>",
1877
+ "lstrip": false,
1878
+ "normalized": false,
1879
+ "rstrip": false,
1880
+ "single_word": false,
1881
+ "special": true
1882
+ },
1883
+ "128235": {
1884
+ "content": "<|reserved_special_token_230|>",
1885
+ "lstrip": false,
1886
+ "normalized": false,
1887
+ "rstrip": false,
1888
+ "single_word": false,
1889
+ "special": true
1890
+ },
1891
+ "128236": {
1892
+ "content": "<|reserved_special_token_231|>",
1893
+ "lstrip": false,
1894
+ "normalized": false,
1895
+ "rstrip": false,
1896
+ "single_word": false,
1897
+ "special": true
1898
+ },
1899
+ "128237": {
1900
+ "content": "<|reserved_special_token_232|>",
1901
+ "lstrip": false,
1902
+ "normalized": false,
1903
+ "rstrip": false,
1904
+ "single_word": false,
1905
+ "special": true
1906
+ },
1907
+ "128238": {
1908
+ "content": "<|reserved_special_token_233|>",
1909
+ "lstrip": false,
1910
+ "normalized": false,
1911
+ "rstrip": false,
1912
+ "single_word": false,
1913
+ "special": true
1914
+ },
1915
+ "128239": {
1916
+ "content": "<|reserved_special_token_234|>",
1917
+ "lstrip": false,
1918
+ "normalized": false,
1919
+ "rstrip": false,
1920
+ "single_word": false,
1921
+ "special": true
1922
+ },
1923
+ "128240": {
1924
+ "content": "<|reserved_special_token_235|>",
1925
+ "lstrip": false,
1926
+ "normalized": false,
1927
+ "rstrip": false,
1928
+ "single_word": false,
1929
+ "special": true
1930
+ },
1931
+ "128241": {
1932
+ "content": "<|reserved_special_token_236|>",
1933
+ "lstrip": false,
1934
+ "normalized": false,
1935
+ "rstrip": false,
1936
+ "single_word": false,
1937
+ "special": true
1938
+ },
1939
+ "128242": {
1940
+ "content": "<|reserved_special_token_237|>",
1941
+ "lstrip": false,
1942
+ "normalized": false,
1943
+ "rstrip": false,
1944
+ "single_word": false,
1945
+ "special": true
1946
+ },
1947
+ "128243": {
1948
+ "content": "<|reserved_special_token_238|>",
1949
+ "lstrip": false,
1950
+ "normalized": false,
1951
+ "rstrip": false,
1952
+ "single_word": false,
1953
+ "special": true
1954
+ },
1955
+ "128244": {
1956
+ "content": "<|reserved_special_token_239|>",
1957
+ "lstrip": false,
1958
+ "normalized": false,
1959
+ "rstrip": false,
1960
+ "single_word": false,
1961
+ "special": true
1962
+ },
1963
+ "128245": {
1964
+ "content": "<|reserved_special_token_240|>",
1965
+ "lstrip": false,
1966
+ "normalized": false,
1967
+ "rstrip": false,
1968
+ "single_word": false,
1969
+ "special": true
1970
+ },
1971
+ "128246": {
1972
+ "content": "<|reserved_special_token_241|>",
1973
+ "lstrip": false,
1974
+ "normalized": false,
1975
+ "rstrip": false,
1976
+ "single_word": false,
1977
+ "special": true
1978
+ },
1979
+ "128247": {
1980
+ "content": "<|reserved_special_token_242|>",
1981
+ "lstrip": false,
1982
+ "normalized": false,
1983
+ "rstrip": false,
1984
+ "single_word": false,
1985
+ "special": true
1986
+ },
1987
+ "128248": {
1988
+ "content": "<|reserved_special_token_243|>",
1989
+ "lstrip": false,
1990
+ "normalized": false,
1991
+ "rstrip": false,
1992
+ "single_word": false,
1993
+ "special": true
1994
+ },
1995
+ "128249": {
1996
+ "content": "<|reserved_special_token_244|>",
1997
+ "lstrip": false,
1998
+ "normalized": false,
1999
+ "rstrip": false,
2000
+ "single_word": false,
2001
+ "special": true
2002
+ },
2003
+ "128250": {
2004
+ "content": "<|reserved_special_token_245|>",
2005
+ "lstrip": false,
2006
+ "normalized": false,
2007
+ "rstrip": false,
2008
+ "single_word": false,
2009
+ "special": true
2010
+ },
2011
+ "128251": {
2012
+ "content": "<|reserved_special_token_246|>",
2013
+ "lstrip": false,
2014
+ "normalized": false,
2015
+ "rstrip": false,
2016
+ "single_word": false,
2017
+ "special": true
2018
+ },
2019
+ "128252": {
2020
+ "content": "<|reserved_special_token_247|>",
2021
+ "lstrip": false,
2022
+ "normalized": false,
2023
+ "rstrip": false,
2024
+ "single_word": false,
2025
+ "special": true
2026
+ },
2027
+ "128253": {
2028
+ "content": "<|reserved_special_token_248|>",
2029
+ "lstrip": false,
2030
+ "normalized": false,
2031
+ "rstrip": false,
2032
+ "single_word": false,
2033
+ "special": true
2034
+ },
2035
+ "128254": {
2036
+ "content": "<|reserved_special_token_249|>",
2037
+ "lstrip": false,
2038
+ "normalized": false,
2039
+ "rstrip": false,
2040
+ "single_word": false,
2041
+ "special": true
2042
+ },
2043
+ "128255": {
2044
+ "content": "<|reserved_special_token_250|>",
2045
+ "lstrip": false,
2046
+ "normalized": false,
2047
+ "rstrip": false,
2048
+ "single_word": false,
2049
+ "special": true
2050
+ }
2051
+ },
2052
+ "bos_token": "<|begin_of_text|>",
2053
+ "chat_template": "{% set loop_messages = messages %}{% for message in loop_messages %}{% set content = '<|start_header_id|>' + message['role'] + '<|end_header_id|>\n\n'+ message['content'] | trim + '<|eot_id|>' %}{% if loop.index0 == 0 %}{% set content = bos_token + content %}{% endif %}{{ content }}{% endfor %}{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}",
2054
+ "clean_up_tokenization_spaces": true,
2055
+ "eos_token": "<|eot_id|>",
2056
+ "model_input_names": [
2057
+ "input_ids",
2058
+ "attention_mask"
2059
+ ],
2060
+ "model_max_length": 1000000000000000019884624838656,
2061
+ "pad_token": "<|reserved_special_token_0|>",
2062
+ "padding_side": "right",
2063
+ "tokenizer_class": "PreTrainedTokenizerFast"
2064
+ }
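
Note: the config above declares a Llama-3-style special-token layout and chat template. A minimal sketch of how such a config is typically exercised with the `transformers` library follows; the repository id is a placeholder, and the rendered prompt depends only on the chat_template string shown above.

    # Minimal sketch, assuming the `transformers` library is installed.
    from transformers import AutoTokenizer

    # Hypothetical repo id; substitute the repository this commit belongs to.
    tok = AutoTokenizer.from_pretrained("your-org/your-model")

    messages = [{"role": "user", "content": "Hello!"}]

    # The chat_template wraps each turn as
    # <|start_header_id|>{role}<|end_header_id|>\n\n{content}<|eot_id|>,
    # prepends <|begin_of_text|> to the first turn, and always appends the
    # assistant header to cue the model's reply.
    prompt = tok.apply_chat_template(messages, tokenize=False)
    print(prompt)
    # <|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n
    # Hello!<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n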