[WIP] Upload folder using huggingface_hub (multi-commit 776e029ac2e171ec92097c011f9aab9f738e101019578fcc0ccecd1b3703eda1)

#1
by jayr014 - opened
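For context: pull requests with this title pattern are generated by `huggingface_hub`'s experimental multi-commit upload, which splits a large folder upload into several commits on a draft PR instead of one huge commit; judging by the "[WIP]" prefix, this one was opened while the upload was still in progress. A minimal sketch of how such a PR is typically created, assuming a `huggingface_hub` release that still ships the experimental `multi_commits` flags (the local folder path is illustrative):

```python
# Sketch only: how a multi-commit upload PR like this one is typically created.
# Assumes a huggingface_hub version with the experimental multi_commits option;
# the local folder path is illustrative.
from huggingface_hub import HfApi

api = HfApi()
api.upload_folder(
    folder_path="./SN-13B-8k-Instruct",            # local weights, config, tokenizer files
    repo_id="sambanovasystems/SN-13B-8k-Instruct",
    repo_type="model",
    multi_commits=True,           # split the upload into several commits on a draft PR
    multi_commits_verbose=True,   # log progress per chunk of files
)
```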
LICENSE DELETED
@@ -1,176 +0,0 @@
1
-
2
- Apache License
3
- Version 2.0, January 2004
4
- http://www.apache.org/licenses/
5
-
6
- TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
7
-
8
- 1. Definitions.
9
-
10
- "License" shall mean the terms and conditions for use, reproduction,
11
- and distribution as defined by Sections 1 through 9 of this document.
12
-
13
- "Licensor" shall mean the copyright owner or entity authorized by
14
- the copyright owner that is granting the License.
15
-
16
- "Legal Entity" shall mean the union of the acting entity and all
17
- other entities that control, are controlled by, or are under common
18
- control with that entity. For the purposes of this definition,
19
- "control" means (i) the power, direct or indirect, to cause the
20
- direction or management of such entity, whether by contract or
21
- otherwise, or (ii) ownership of fifty percent (50%) or more of the
22
- outstanding shares, or (iii) beneficial ownership of such entity.
23
-
24
- "You" (or "Your") shall mean an individual or Legal Entity
25
- exercising permissions granted by this License.
26
-
27
- "Source" form shall mean the preferred form for making modifications,
28
- including but not limited to software source code, documentation
29
- source, and configuration files.
30
-
31
- "Object" form shall mean any form resulting from mechanical
32
- transformation or translation of a Source form, including but
33
- not limited to compiled object code, generated documentation,
34
- and conversions to other media types.
35
-
36
- "Work" shall mean the work of authorship, whether in Source or
37
- Object form, made available under the License, as indicated by a
38
- copyright notice that is included in or attached to the work
39
- (an example is provided in the Appendix below).
40
-
41
- "Derivative Works" shall mean any work, whether in Source or Object
42
- form, that is based on (or derived from) the Work and for which the
43
- editorial revisions, annotations, elaborations, or other modifications
44
- represent, as a whole, an original work of authorship. For the purposes
45
- of this License, Derivative Works shall not include works that remain
46
- separable from, or merely link (or bind by name) to the interfaces of,
47
- the Work and Derivative Works thereof.
48
-
49
- "Contribution" shall mean any work of authorship, including
50
- the original version of the Work and any modifications or additions
51
- to that Work or Derivative Works thereof, that is intentionally
52
- submitted to Licensor for inclusion in the Work by the copyright owner
53
- or by an individual or Legal Entity authorized to submit on behalf of
54
- the copyright owner. For the purposes of this definition, "submitted"
55
- means any form of electronic, verbal, or written communication sent
56
- to the Licensor or its representatives, including but not limited to
57
- communication on electronic mailing lists, source code control systems,
58
- and issue tracking systems that are managed by, or on behalf of, the
59
- Licensor for the purpose of discussing and improving the Work, but
60
- excluding communication that is conspicuously marked or otherwise
61
- designated in writing by the copyright owner as "Not a Contribution."
62
-
63
- "Contributor" shall mean Licensor and any individual or Legal Entity
64
- on behalf of whom a Contribution has been received by Licensor and
65
- subsequently incorporated within the Work.
66
-
67
- 2. Grant of Copyright License. Subject to the terms and conditions of
68
- this License, each Contributor hereby grants to You a perpetual,
69
- worldwide, non-exclusive, no-charge, royalty-free, irrevocable
70
- copyright license to reproduce, prepare Derivative Works of,
71
- publicly display, publicly perform, sublicense, and distribute the
72
- Work and such Derivative Works in Source or Object form.
73
-
74
- 3. Grant of Patent License. Subject to the terms and conditions of
75
- this License, each Contributor hereby grants to You a perpetual,
76
- worldwide, non-exclusive, no-charge, royalty-free, irrevocable
77
- (except as stated in this section) patent license to make, have made,
78
- use, offer to sell, sell, import, and otherwise transfer the Work,
79
- where such license applies only to those patent claims licensable
80
- by such Contributor that are necessarily infringed by their
81
- Contribution(s) alone or by combination of their Contribution(s)
82
- with the Work to which such Contribution(s) was submitted. If You
83
- institute patent litigation against any entity (including a
84
- cross-claim or counterclaim in a lawsuit) alleging that the Work
85
- or a Contribution incorporated within the Work constitutes direct
86
- or contributory patent infringement, then any patent licenses
87
- granted to You under this License for that Work shall terminate
88
- as of the date such litigation is filed.
89
-
90
- 4. Redistribution. You may reproduce and distribute copies of the
91
- Work or Derivative Works thereof in any medium, with or without
92
- modifications, and in Source or Object form, provided that You
93
- meet the following conditions:
94
-
95
- (a) You must give any other recipients of the Work or
96
- Derivative Works a copy of this License; and
97
-
98
- (b) You must cause any modified files to carry prominent notices
99
- stating that You changed the files; and
100
-
101
- (c) You must retain, in the Source form of any Derivative Works
102
- that You distribute, all copyright, patent, trademark, and
103
- attribution notices from the Source form of the Work,
104
- excluding those notices that do not pertain to any part of
105
- the Derivative Works; and
106
-
107
- (d) If the Work includes a "NOTICE" text file as part of its
108
- distribution, then any Derivative Works that You distribute must
109
- include a readable copy of the attribution notices contained
110
- within such NOTICE file, excluding those notices that do not
111
- pertain to any part of the Derivative Works, in at least one
112
- of the following places: within a NOTICE text file distributed
113
- as part of the Derivative Works; within the Source form or
114
- documentation, if provided along with the Derivative Works; or,
115
- within a display generated by the Derivative Works, if and
116
- wherever such third-party notices normally appear. The contents
117
- of the NOTICE file are for informational purposes only and
118
- do not modify the License. You may add Your own attribution
119
- notices within Derivative Works that You distribute, alongside
120
- or as an addendum to the NOTICE text from the Work, provided
121
- that such additional attribution notices cannot be construed
122
- as modifying the License.
123
-
124
- You may add Your own copyright statement to Your modifications and
125
- may provide additional or different license terms and conditions
126
- for use, reproduction, or distribution of Your modifications, or
127
- for any such Derivative Works as a whole, provided Your use,
128
- reproduction, and distribution of the Work otherwise complies with
129
- the conditions stated in this License.
130
-
131
- 5. Submission of Contributions. Unless You explicitly state otherwise,
132
- any Contribution intentionally submitted for inclusion in the Work
133
- by You to the Licensor shall be under the terms and conditions of
134
- this License, without any additional terms or conditions.
135
- Notwithstanding the above, nothing herein shall supersede or modify
136
- the terms of any separate license agreement you may have executed
137
- with Licensor regarding such Contributions.
138
-
139
- 6. Trademarks. This License does not grant permission to use the trade
140
- names, trademarks, service marks, or product names of the Licensor,
141
- except as required for reasonable and customary use in describing the
142
- origin of the Work and reproducing the content of the NOTICE file.
143
-
144
- 7. Disclaimer of Warranty. Unless required by applicable law or
145
- agreed to in writing, Licensor provides the Work (and each
146
- Contributor provides its Contributions) on an "AS IS" BASIS,
147
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
148
- implied, including, without limitation, any warranties or conditions
149
- of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
150
- PARTICULAR PURPOSE. You are solely responsible for determining the
151
- appropriateness of using or redistributing the Work and assume any
152
- risks associated with Your exercise of permissions under this License.
153
-
154
- 8. Limitation of Liability. In no event and under no legal theory,
155
- whether in tort (including negligence), contract, or otherwise,
156
- unless required by applicable law (such as deliberate and grossly
157
- negligent acts) or agreed to in writing, shall any Contributor be
158
- liable to You for damages, including any direct, indirect, special,
159
- incidental, or consequential damages of any character arising as a
160
- result of this License or out of the use or inability to use the
161
- Work (including but not limited to damages for loss of goodwill,
162
- work stoppage, computer failure or malfunction, or any and all
163
- other commercial damages or losses), even if such Contributor
164
- has been advised of the possibility of such damages.
165
-
166
- 9. Accepting Warranty or Additional Liability. While redistributing
167
- the Work or Derivative Works thereof, You may choose to offer,
168
- and charge a fee for, acceptance of support, warranty, indemnity,
169
- or other liability obligations and/or rights consistent with this
170
- License. However, in accepting such obligations, You may act only
171
- on Your own behalf and on Your sole responsibility, not on behalf
172
- of any other Contributor, and only if You agree to indemnify,
173
- defend, and hold each Contributor harmless for any liability
174
- incurred by, or claims asserted against, such Contributor by reason
175
- of your accepting any such warranty or additional liability.
176
-
README.md DELETED
@@ -1,167 +0,0 @@
- ---
- # For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
- # Doc / guide: https://huggingface.co/docs/hub/model-cards
- license: apache-2.0
- inference: false
- ---
-
- # SN-13B-8k-Instruct
-
- <!-- Provide a quick summary of what the model is/does. -->
-
- SN-13B-8k-Instruct is a 13 billion parameter model. It was pretrained and instruction-tuned on
- [SambaNova DataScale systems](https://sambanova.ai/products/datascale/). This model is meant to be used for tasks requiring long sequence understanding.
-
- ## Model Details
-
- ### Model Description
-
- <!-- Provide a longer summary of what this model is. -->
-
- - **Developed by:** [SambaNova Systems](https://sambanova.ai/)
- - **Model type:** Language Model
- - **Language(s):** English
- - **License:** Apache 2.0
-
- ### Basic Information
-
- <!-- Provide the basic links for the model. -->
- - **Blog Post**: [Link](https://sambanova.ai/blog/training-long-sequence-size-models-on-sambanova/)
- - **Discord**: [Link](https://discord.com/invite/8z2Pe7cpRv)
- <!-- - **Github**: [Link](https://github.com/sambanova/bloomchat) -->
-
- ### Licensing
-
- To increase accessibility and to support the open-source community, SambaNova is releasing SN-13B-8k-Instruct under an Apache 2.0 license. [Please review SambaNova’s SN-13B-8k-Instruct License](LICENSE)
-
- ## Uses
- <details>
- <summary>Click to expand</summary>
- <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
-
- ### Direct Use
-
- <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
- This model is intended for commercial and research use.
-
-
- ### Out-of-Scope Use
-
- <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
-
-
- SN-13B-8k-Instruct should NOT be used for:
-
- - Mission-critical applications
- - Applications that involve the safety of others
- - Making highly important decisions
- - Important automated pipelines
-
- This model is still in early development and can be prone to mistakes and hallucinations; there is still room for improvement. It is intended to provide the community with a long-sequence instruction-tuned LLM baseline.
-
- ### Recommendations
-
- <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
-
- Users should be made aware of the risks, biases, limitations, and restrictions of the model, which are listed at the bottom of the page.
-
- </details>
-
-
- ---
- ## Running the model
-
- ```python
- from transformers import AutoModelForCausalLM, AutoTokenizer
-
- tokenizer = AutoTokenizer.from_pretrained("sambanovasystems/SN-13B-8k-Instruct")
- model = AutoModelForCausalLM.from_pretrained("sambanovasystems/SN-13B-8k-Instruct")
-
- prompt = 'Define Machine Learning.'
- inputs = tokenizer(prompt, return_tensors='pt')
-
- # SN-13B-8k-Instruct occasionally repeats itself when do_sample=False.
- # Set do_sample=True when using the model to avoid this.
- outputs = model.generate(**inputs, use_cache=True, max_new_tokens=50, do_sample=False)
-
- print(tokenizer.batch_decode(outputs))
- ```
-
- ---
-
- ## Training Details
-
- <details>
- <summary>Click to expand</summary>
-
- ### Training Procedure
-
- <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
-
- We trained SN-13B-8k-Instruct on [SambaNova DataScale systems](https://sambanova.ai/products/datascale/) with
- SambaNova's in-house Reconfigurable Dataflow Unit (RDU). We started from random weights and pretrained for 300 billion
- tokens on sequences of size 2048. We then pretrained for another 250 billion tokens on sequences of size 8192.
- During this phase of training, we curated a dataset with a large proportion of long-sequence articles:
- 30% of the articles were longer than 6000 words.
-
- We applied instruction tuning on a variety of tasks derived from datasets such as FLANv2, P3, Natural Instructions, etc.
-
- ### Hyperparameters
-
- **Pretraining on 8k SS (sequence size)**
-
- - Hardware: SambaNova Reconfigurable Dataflow Unit (RDU)
- - Optimizer: AdamW
- - Steps: 60000
- - Global Batch size: 1024
- - Learning Rate: 1e-5
- - Learning Rate Scheduler: Fixed
- - Warmup Steps: 0
- - Weight decay: 0.1
-
- **Instruction-tuned Training**
-
- - Hardware: SambaNova Reconfigurable Dataflow Unit (RDU)
- - Optimizer: AdamW
- - Steps: 35000
- - Global Batch size: 64
- - Learning Rate: 1e-5
- - Learning Rate Scheduler: Fixed
- - Warmup Steps: 0
- - Weight decay: 0.1
-
- </details>
-
- ---
-
- ## Bias, Risks, and Limitations
-
- <!-- This section is meant to convey both technical and sociotechnical limitations. -->
-
- Like all LLMs, SN-13B-8k-Instruct has certain limitations:
- - Hallucination: SN-13B-8k-Instruct may sometimes generate responses that contain plausible-sounding but factually incorrect or irrelevant information.
- - Repetition: SN-13B-8k-Instruct may produce repetitive phrases or sentences, leading to less engaging and informative responses.
- - Coding and Math: The model's performance in generating accurate code or solving complex mathematical problems may be limited.
- - Toxicity: SN-13B-8k-Instruct may inadvertently generate responses containing inappropriate or harmful content.
-
- ## Acknowledgment
-
- We appreciate [Scrolls](https://www.scrolls-benchmark.com/) and [ZeroScrolls](https://www.zero.scrolls-benchmark.com/) for their contributions in creating effective benchmarks to test the long sequence understanding of Large Language Models.
- We appreciate [lm-eval-harness](https://github.com/EleutherAI/lm-evaluation-harness) and [HELM](https://crfm.stanford.edu/helm/latest/) for their essential benchmarking contributions,
- which were both very helpful in evaluating SN-13B-8k-Instruct's performance. We also drew inspiration from the recent wave of open-source long sequence models,
- including [XGen](https://blog.salesforceairesearch.com/xgen/), [MPT](https://www.mosaicml.com/blog/long-context-mpt-7b-8k), and
- [Llama-2](https://ai.meta.com/llama/), among others. We look forward to witnessing the continued growth and success of open-source long sequence models.
-
- We highly appreciate the hard work and dedication of these researchers and organizations towards the advancement of the open-source community. Their contributions were invaluable in the development of SN-13B-8k-Instruct, and we hope that our model can contribute to further advancements in the field.
-
- ## Cite SN-13B-8k-Instruct
- ```
- @software{sn-13b-8k-instruct,
-   title = {SN-13B-8k-Instruct: training long sequence size models with SambaNova},
-   author = {SambaNova Systems},
-   url = {https://huggingface.co/sambanovasystems/SN-13B-8k-Instruct},
-   month = {8},
-   year = {2023},
-   version = {1.0},
- }
- ```
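The deleted card's usage example calls `generate` with `do_sample=False` even though its own comment recommends sampling to avoid repetition. For completeness, a variant that follows that recommendation; the specific sampling parameters below are illustrative choices, not values taken from the card:

```python
# Variant of the card's example that follows its own advice to enable sampling.
# top_p and temperature are illustrative values, not prescribed by the card.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("sambanovasystems/SN-13B-8k-Instruct")
model = AutoModelForCausalLM.from_pretrained("sambanovasystems/SN-13B-8k-Instruct")

inputs = tokenizer("Define Machine Learning.", return_tensors="pt")
outputs = model.generate(
    **inputs,
    use_cache=True,
    max_new_tokens=50,
    do_sample=True,   # sampling reduces the repetition the card warns about
    top_p=0.95,
    temperature=0.7,
)
print(tokenizer.batch_decode(outputs))
```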
config.json DELETED
@@ -1,42 +0,0 @@
- {
-   "_name_or_path": "/scratch1/jayr/gpt13b_itv3.5",
-   "activation_function": "gelu_new",
-   "architectures": [
-     "GPT2Model"
-   ],
-   "attn_pdrop": 0.1,
-   "bos_token_id": 50256,
-   "embd_pdrop": 0.0,
-   "eos_token_id": 50256,
-   "gradient_checkpointing": false,
-   "initializer_range": 0.02,
-   "large_bs_inference": false,
-   "layer_norm_epsilon": 1e-05,
-   "model_type": "gpt2",
-   "n_ctx": 8192,
-   "n_embd": 5120,
-   "n_head": 40,
-   "n_inner": null,
-   "n_layer": 40,
-   "n_positions": 8192,
-   "reorder_and_upcast_attn": false,
-   "resid_pdrop": 0.1,
-   "return_dict": false,
-   "scale_attn_by_inverse_layer_idx": false,
-   "scale_attn_weights": true,
-   "summary_activation": null,
-   "summary_first_dropout": 0.1,
-   "summary_proj_to_labels": true,
-   "summary_type": "cls_index",
-   "summary_use_proj": true,
-   "task_specific_params": {
-     "text-generation": {
-       "do_sample": true,
-       "max_length": 50
-     }
-   },
-   "torch_dtype": "float32",
-   "transformers_version": "4.18.0",
-   "use_cache": true,
-   "vocab_size": 50260
- }
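The deleted config describes a GPT-2-style network with 40 layers, 40 heads, a hidden size of 5120, and an 8192-token context. A quick back-of-envelope count using the standard GPT-2 parameter formula (ignoring biases and layer norms, which are negligible at this scale) matches the card's "13 billion parameter" figure:

```python
# Rough parameter count for the architecture in config.json above.
# Each transformer block contributes ~12 * n_embd^2 weights
# (4 * n_embd^2 for attention, 8 * n_embd^2 for the MLP).
n_layer, n_embd, n_positions, vocab_size = 40, 5120, 8192, 50260

blocks = n_layer * 12 * n_embd ** 2                       # ~12.6B
embeddings = (vocab_size + n_positions) * n_embd          # wte + wpe, ~0.3B
print(f"~{(blocks + embeddings) / 1e9:.1f}B parameters")  # ~12.9B, i.e. the advertised ~13B
```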
merges.txt DELETED
The diff for this file is too large to render. See raw diff
 
pytorch_model-00001-of-00006.bin DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:1bfa526df470ca695e792a56cdbd89c41c023768661a763fea6a593ca59277f1
- size 9637826281

pytorch_model-00002-of-00006.bin DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:f39b4b18ae8677a75bcf76fef72ad3302a65d83bddbb2a7875dcb052d2d01c30
- size 9699205981

pytorch_model-00003-of-00006.bin DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:bc656acbae27131d84153e8daada6a446d0716728d1a1c2ad083e0bc0156d738
- size 9766295653

pytorch_model-00004-of-00006.bin DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:1f38d0df96c2f08a1f31c5bbe31adde0029e9e71519097dd39340204c8140e6a
- size 9699248065

pytorch_model-00005-of-00006.bin DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:3cb7ce956c0c1480fbe92c1e43199ca5d2586d5813c030e837c031b22e3e2537
- size 9699206045

pytorch_model-00006-of-00006.bin DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:ed7ca6af6180c9923d18c69eb68353948455d7546bb5419eaf30d3b1aba1dbe5
- size 5722174405
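The six `pytorch_model-*.bin` diffs above are git-LFS pointer stubs (spec version, SHA-256 digest, byte size) rather than the tensors themselves; their listed sizes sum to roughly 54.2 GB. A sketch for pulling the actual shards with the standard `huggingface_hub` download API; the pattern filters are illustrative:

```python
# Sketch: fetch the real weight shards that these LFS pointers reference.
# Pattern filters are illustrative; by default everything in the repo is downloaded.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="sambanovasystems/SN-13B-8k-Instruct",
    allow_patterns=["*.bin", "*.json", "*.txt"],  # shards, configs, tokenizer files
)
print(local_dir)  # cache directory now holding the ~54 GB of shards
```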
pytorch_model.bin.index.json DELETED
@@ -1,571 +0,0 @@
1
- {
2
- "metadata": {
3
- "total_size": 54223790240
4
- },
5
- "weight_map": {
6
- "h.0.attn.bias": "pytorch_model-00001-of-00006.bin",
7
- "h.0.attn.c_attn.bias": "pytorch_model-00001-of-00006.bin",
8
- "h.0.attn.c_attn.weight": "pytorch_model-00001-of-00006.bin",
9
- "h.0.attn.c_proj.bias": "pytorch_model-00001-of-00006.bin",
10
- "h.0.attn.c_proj.weight": "pytorch_model-00001-of-00006.bin",
11
- "h.0.attn.masked_bias": "pytorch_model-00001-of-00006.bin",
12
- "h.0.ln_1.bias": "pytorch_model-00001-of-00006.bin",
13
- "h.0.ln_1.weight": "pytorch_model-00001-of-00006.bin",
14
- "h.0.ln_2.bias": "pytorch_model-00001-of-00006.bin",
15
- "h.0.ln_2.weight": "pytorch_model-00001-of-00006.bin",
16
- "h.0.mlp.c_fc.bias": "pytorch_model-00001-of-00006.bin",
17
- "h.0.mlp.c_fc.weight": "pytorch_model-00001-of-00006.bin",
18
- "h.0.mlp.c_proj.bias": "pytorch_model-00001-of-00006.bin",
19
- "h.0.mlp.c_proj.weight": "pytorch_model-00001-of-00006.bin",
20
- "h.1.attn.bias": "pytorch_model-00001-of-00006.bin",
21
- "h.1.attn.c_attn.bias": "pytorch_model-00001-of-00006.bin",
22
- "h.1.attn.c_attn.weight": "pytorch_model-00001-of-00006.bin",
23
- "h.1.attn.c_proj.bias": "pytorch_model-00001-of-00006.bin",
24
- "h.1.attn.c_proj.weight": "pytorch_model-00001-of-00006.bin",
25
- "h.1.attn.masked_bias": "pytorch_model-00001-of-00006.bin",
26
- "h.1.ln_1.bias": "pytorch_model-00001-of-00006.bin",
27
- "h.1.ln_1.weight": "pytorch_model-00001-of-00006.bin",
28
- "h.1.ln_2.bias": "pytorch_model-00001-of-00006.bin",
29
- "h.1.ln_2.weight": "pytorch_model-00001-of-00006.bin",
30
- "h.1.mlp.c_fc.bias": "pytorch_model-00001-of-00006.bin",
31
- "h.1.mlp.c_fc.weight": "pytorch_model-00001-of-00006.bin",
32
- "h.1.mlp.c_proj.bias": "pytorch_model-00001-of-00006.bin",
33
- "h.1.mlp.c_proj.weight": "pytorch_model-00001-of-00006.bin",
34
- "h.10.attn.bias": "pytorch_model-00002-of-00006.bin",
35
- "h.10.attn.c_attn.bias": "pytorch_model-00002-of-00006.bin",
36
- "h.10.attn.c_attn.weight": "pytorch_model-00002-of-00006.bin",
37
- "h.10.attn.c_proj.bias": "pytorch_model-00002-of-00006.bin",
38
- "h.10.attn.c_proj.weight": "pytorch_model-00002-of-00006.bin",
39
- "h.10.attn.masked_bias": "pytorch_model-00002-of-00006.bin",
40
- "h.10.ln_1.bias": "pytorch_model-00002-of-00006.bin",
41
- "h.10.ln_1.weight": "pytorch_model-00002-of-00006.bin",
42
- "h.10.ln_2.bias": "pytorch_model-00002-of-00006.bin",
43
- "h.10.ln_2.weight": "pytorch_model-00002-of-00006.bin",
44
- "h.10.mlp.c_fc.bias": "pytorch_model-00002-of-00006.bin",
45
- "h.10.mlp.c_fc.weight": "pytorch_model-00002-of-00006.bin",
46
- "h.10.mlp.c_proj.bias": "pytorch_model-00002-of-00006.bin",
47
- "h.10.mlp.c_proj.weight": "pytorch_model-00002-of-00006.bin",
48
- "h.11.attn.bias": "pytorch_model-00002-of-00006.bin",
49
- "h.11.attn.c_attn.bias": "pytorch_model-00002-of-00006.bin",
50
- "h.11.attn.c_attn.weight": "pytorch_model-00002-of-00006.bin",
51
- "h.11.attn.c_proj.bias": "pytorch_model-00002-of-00006.bin",
52
- "h.11.attn.c_proj.weight": "pytorch_model-00002-of-00006.bin",
53
- "h.11.attn.masked_bias": "pytorch_model-00002-of-00006.bin",
54
- "h.11.ln_1.bias": "pytorch_model-00002-of-00006.bin",
55
- "h.11.ln_1.weight": "pytorch_model-00002-of-00006.bin",
56
- "h.11.ln_2.bias": "pytorch_model-00002-of-00006.bin",
57
- "h.11.ln_2.weight": "pytorch_model-00002-of-00006.bin",
58
- "h.11.mlp.c_fc.bias": "pytorch_model-00002-of-00006.bin",
59
- "h.11.mlp.c_fc.weight": "pytorch_model-00002-of-00006.bin",
60
- "h.11.mlp.c_proj.bias": "pytorch_model-00002-of-00006.bin",
61
- "h.11.mlp.c_proj.weight": "pytorch_model-00002-of-00006.bin",
62
- "h.12.attn.bias": "pytorch_model-00002-of-00006.bin",
63
- "h.12.attn.c_attn.bias": "pytorch_model-00002-of-00006.bin",
64
- "h.12.attn.c_attn.weight": "pytorch_model-00002-of-00006.bin",
65
- "h.12.attn.c_proj.bias": "pytorch_model-00002-of-00006.bin",
66
- "h.12.attn.c_proj.weight": "pytorch_model-00002-of-00006.bin",
67
- "h.12.attn.masked_bias": "pytorch_model-00002-of-00006.bin",
68
- "h.12.ln_1.bias": "pytorch_model-00002-of-00006.bin",
69
- "h.12.ln_1.weight": "pytorch_model-00002-of-00006.bin",
70
- "h.12.ln_2.bias": "pytorch_model-00002-of-00006.bin",
71
- "h.12.ln_2.weight": "pytorch_model-00002-of-00006.bin",
72
- "h.12.mlp.c_fc.bias": "pytorch_model-00002-of-00006.bin",
73
- "h.12.mlp.c_fc.weight": "pytorch_model-00002-of-00006.bin",
74
- "h.12.mlp.c_proj.bias": "pytorch_model-00002-of-00006.bin",
75
- "h.12.mlp.c_proj.weight": "pytorch_model-00002-of-00006.bin",
76
- "h.13.attn.bias": "pytorch_model-00002-of-00006.bin",
77
- "h.13.attn.c_attn.bias": "pytorch_model-00002-of-00006.bin",
78
- "h.13.attn.c_attn.weight": "pytorch_model-00002-of-00006.bin",
79
- "h.13.attn.c_proj.bias": "pytorch_model-00002-of-00006.bin",
80
- "h.13.attn.c_proj.weight": "pytorch_model-00002-of-00006.bin",
81
- "h.13.attn.masked_bias": "pytorch_model-00002-of-00006.bin",
82
- "h.13.ln_1.bias": "pytorch_model-00002-of-00006.bin",
83
- "h.13.ln_1.weight": "pytorch_model-00002-of-00006.bin",
84
- "h.13.ln_2.bias": "pytorch_model-00002-of-00006.bin",
85
- "h.13.ln_2.weight": "pytorch_model-00002-of-00006.bin",
86
- "h.13.mlp.c_fc.bias": "pytorch_model-00002-of-00006.bin",
87
- "h.13.mlp.c_fc.weight": "pytorch_model-00002-of-00006.bin",
88
- "h.13.mlp.c_proj.bias": "pytorch_model-00003-of-00006.bin",
89
- "h.13.mlp.c_proj.weight": "pytorch_model-00003-of-00006.bin",
90
- "h.14.attn.bias": "pytorch_model-00003-of-00006.bin",
91
- "h.14.attn.c_attn.bias": "pytorch_model-00003-of-00006.bin",
92
- "h.14.attn.c_attn.weight": "pytorch_model-00003-of-00006.bin",
93
- "h.14.attn.c_proj.bias": "pytorch_model-00003-of-00006.bin",
94
- "h.14.attn.c_proj.weight": "pytorch_model-00003-of-00006.bin",
95
- "h.14.attn.masked_bias": "pytorch_model-00003-of-00006.bin",
96
- "h.14.ln_1.bias": "pytorch_model-00003-of-00006.bin",
97
- "h.14.ln_1.weight": "pytorch_model-00003-of-00006.bin",
98
- "h.14.ln_2.bias": "pytorch_model-00003-of-00006.bin",
99
- "h.14.ln_2.weight": "pytorch_model-00003-of-00006.bin",
100
- "h.14.mlp.c_fc.bias": "pytorch_model-00003-of-00006.bin",
101
- "h.14.mlp.c_fc.weight": "pytorch_model-00003-of-00006.bin",
102
- "h.14.mlp.c_proj.bias": "pytorch_model-00003-of-00006.bin",
103
- "h.14.mlp.c_proj.weight": "pytorch_model-00003-of-00006.bin",
104
- "h.15.attn.bias": "pytorch_model-00003-of-00006.bin",
105
- "h.15.attn.c_attn.bias": "pytorch_model-00003-of-00006.bin",
106
- "h.15.attn.c_attn.weight": "pytorch_model-00003-of-00006.bin",
107
- "h.15.attn.c_proj.bias": "pytorch_model-00003-of-00006.bin",
108
- "h.15.attn.c_proj.weight": "pytorch_model-00003-of-00006.bin",
109
- "h.15.attn.masked_bias": "pytorch_model-00003-of-00006.bin",
110
- "h.15.ln_1.bias": "pytorch_model-00003-of-00006.bin",
111
- "h.15.ln_1.weight": "pytorch_model-00003-of-00006.bin",
112
- "h.15.ln_2.bias": "pytorch_model-00003-of-00006.bin",
113
- "h.15.ln_2.weight": "pytorch_model-00003-of-00006.bin",
114
- "h.15.mlp.c_fc.bias": "pytorch_model-00003-of-00006.bin",
115
- "h.15.mlp.c_fc.weight": "pytorch_model-00003-of-00006.bin",
116
- "h.15.mlp.c_proj.bias": "pytorch_model-00003-of-00006.bin",
117
- "h.15.mlp.c_proj.weight": "pytorch_model-00003-of-00006.bin",
118
- "h.16.attn.bias": "pytorch_model-00003-of-00006.bin",
119
- "h.16.attn.c_attn.bias": "pytorch_model-00003-of-00006.bin",
120
- "h.16.attn.c_attn.weight": "pytorch_model-00003-of-00006.bin",
121
- "h.16.attn.c_proj.bias": "pytorch_model-00003-of-00006.bin",
122
- "h.16.attn.c_proj.weight": "pytorch_model-00003-of-00006.bin",
123
- "h.16.attn.masked_bias": "pytorch_model-00003-of-00006.bin",
124
- "h.16.ln_1.bias": "pytorch_model-00003-of-00006.bin",
125
- "h.16.ln_1.weight": "pytorch_model-00003-of-00006.bin",
126
- "h.16.ln_2.bias": "pytorch_model-00003-of-00006.bin",
127
- "h.16.ln_2.weight": "pytorch_model-00003-of-00006.bin",
128
- "h.16.mlp.c_fc.bias": "pytorch_model-00003-of-00006.bin",
129
- "h.16.mlp.c_fc.weight": "pytorch_model-00003-of-00006.bin",
130
- "h.16.mlp.c_proj.bias": "pytorch_model-00003-of-00006.bin",
131
- "h.16.mlp.c_proj.weight": "pytorch_model-00003-of-00006.bin",
132
- "h.17.attn.bias": "pytorch_model-00003-of-00006.bin",
133
- "h.17.attn.c_attn.bias": "pytorch_model-00003-of-00006.bin",
134
- "h.17.attn.c_attn.weight": "pytorch_model-00003-of-00006.bin",
135
- "h.17.attn.c_proj.bias": "pytorch_model-00003-of-00006.bin",
136
- "h.17.attn.c_proj.weight": "pytorch_model-00003-of-00006.bin",
137
- "h.17.attn.masked_bias": "pytorch_model-00003-of-00006.bin",
138
- "h.17.ln_1.bias": "pytorch_model-00003-of-00006.bin",
139
- "h.17.ln_1.weight": "pytorch_model-00003-of-00006.bin",
140
- "h.17.ln_2.bias": "pytorch_model-00003-of-00006.bin",
141
- "h.17.ln_2.weight": "pytorch_model-00003-of-00006.bin",
142
- "h.17.mlp.c_fc.bias": "pytorch_model-00003-of-00006.bin",
143
- "h.17.mlp.c_fc.weight": "pytorch_model-00003-of-00006.bin",
144
- "h.17.mlp.c_proj.bias": "pytorch_model-00003-of-00006.bin",
145
- "h.17.mlp.c_proj.weight": "pytorch_model-00003-of-00006.bin",
146
- "h.18.attn.bias": "pytorch_model-00003-of-00006.bin",
147
- "h.18.attn.c_attn.bias": "pytorch_model-00003-of-00006.bin",
148
- "h.18.attn.c_attn.weight": "pytorch_model-00003-of-00006.bin",
149
- "h.18.attn.c_proj.bias": "pytorch_model-00003-of-00006.bin",
150
- "h.18.attn.c_proj.weight": "pytorch_model-00003-of-00006.bin",
151
- "h.18.attn.masked_bias": "pytorch_model-00003-of-00006.bin",
152
- "h.18.ln_1.bias": "pytorch_model-00003-of-00006.bin",
153
- "h.18.ln_1.weight": "pytorch_model-00003-of-00006.bin",
154
- "h.18.ln_2.bias": "pytorch_model-00003-of-00006.bin",
155
- "h.18.ln_2.weight": "pytorch_model-00003-of-00006.bin",
156
- "h.18.mlp.c_fc.bias": "pytorch_model-00003-of-00006.bin",
157
- "h.18.mlp.c_fc.weight": "pytorch_model-00003-of-00006.bin",
158
- "h.18.mlp.c_proj.bias": "pytorch_model-00003-of-00006.bin",
159
- "h.18.mlp.c_proj.weight": "pytorch_model-00003-of-00006.bin",
160
- "h.19.attn.bias": "pytorch_model-00003-of-00006.bin",
161
- "h.19.attn.c_attn.bias": "pytorch_model-00003-of-00006.bin",
162
- "h.19.attn.c_attn.weight": "pytorch_model-00003-of-00006.bin",
163
- "h.19.attn.c_proj.bias": "pytorch_model-00003-of-00006.bin",
164
- "h.19.attn.c_proj.weight": "pytorch_model-00003-of-00006.bin",
165
- "h.19.attn.masked_bias": "pytorch_model-00003-of-00006.bin",
166
- "h.19.ln_1.bias": "pytorch_model-00003-of-00006.bin",
167
- "h.19.ln_1.weight": "pytorch_model-00003-of-00006.bin",
168
- "h.19.ln_2.bias": "pytorch_model-00003-of-00006.bin",
169
- "h.19.ln_2.weight": "pytorch_model-00003-of-00006.bin",
170
- "h.19.mlp.c_fc.bias": "pytorch_model-00003-of-00006.bin",
171
- "h.19.mlp.c_fc.weight": "pytorch_model-00003-of-00006.bin",
172
- "h.19.mlp.c_proj.bias": "pytorch_model-00003-of-00006.bin",
173
- "h.19.mlp.c_proj.weight": "pytorch_model-00003-of-00006.bin",
174
- "h.2.attn.bias": "pytorch_model-00001-of-00006.bin",
175
- "h.2.attn.c_attn.bias": "pytorch_model-00001-of-00006.bin",
176
- "h.2.attn.c_attn.weight": "pytorch_model-00001-of-00006.bin",
177
- "h.2.attn.c_proj.bias": "pytorch_model-00001-of-00006.bin",
178
- "h.2.attn.c_proj.weight": "pytorch_model-00001-of-00006.bin",
179
- "h.2.attn.masked_bias": "pytorch_model-00001-of-00006.bin",
180
- "h.2.ln_1.bias": "pytorch_model-00001-of-00006.bin",
181
- "h.2.ln_1.weight": "pytorch_model-00001-of-00006.bin",
182
- "h.2.ln_2.bias": "pytorch_model-00001-of-00006.bin",
183
- "h.2.ln_2.weight": "pytorch_model-00001-of-00006.bin",
184
- "h.2.mlp.c_fc.bias": "pytorch_model-00001-of-00006.bin",
185
- "h.2.mlp.c_fc.weight": "pytorch_model-00001-of-00006.bin",
186
- "h.2.mlp.c_proj.bias": "pytorch_model-00001-of-00006.bin",
187
- "h.2.mlp.c_proj.weight": "pytorch_model-00001-of-00006.bin",
188
- "h.20.attn.bias": "pytorch_model-00003-of-00006.bin",
189
- "h.20.attn.c_attn.bias": "pytorch_model-00003-of-00006.bin",
190
- "h.20.attn.c_attn.weight": "pytorch_model-00003-of-00006.bin",
191
- "h.20.attn.c_proj.bias": "pytorch_model-00003-of-00006.bin",
192
- "h.20.attn.c_proj.weight": "pytorch_model-00003-of-00006.bin",
193
- "h.20.attn.masked_bias": "pytorch_model-00003-of-00006.bin",
194
- "h.20.ln_1.bias": "pytorch_model-00003-of-00006.bin",
195
- "h.20.ln_1.weight": "pytorch_model-00003-of-00006.bin",
196
- "h.20.ln_2.bias": "pytorch_model-00003-of-00006.bin",
197
- "h.20.ln_2.weight": "pytorch_model-00003-of-00006.bin",
198
- "h.20.mlp.c_fc.bias": "pytorch_model-00003-of-00006.bin",
199
- "h.20.mlp.c_fc.weight": "pytorch_model-00003-of-00006.bin",
200
- "h.20.mlp.c_proj.bias": "pytorch_model-00003-of-00006.bin",
201
- "h.20.mlp.c_proj.weight": "pytorch_model-00003-of-00006.bin",
202
- "h.21.attn.bias": "pytorch_model-00003-of-00006.bin",
203
- "h.21.attn.c_attn.bias": "pytorch_model-00004-of-00006.bin",
204
- "h.21.attn.c_attn.weight": "pytorch_model-00004-of-00006.bin",
205
- "h.21.attn.c_proj.bias": "pytorch_model-00004-of-00006.bin",
206
- "h.21.attn.c_proj.weight": "pytorch_model-00004-of-00006.bin",
207
- "h.21.attn.masked_bias": "pytorch_model-00003-of-00006.bin",
208
- "h.21.ln_1.bias": "pytorch_model-00003-of-00006.bin",
209
- "h.21.ln_1.weight": "pytorch_model-00003-of-00006.bin",
210
- "h.21.ln_2.bias": "pytorch_model-00004-of-00006.bin",
211
- "h.21.ln_2.weight": "pytorch_model-00004-of-00006.bin",
212
- "h.21.mlp.c_fc.bias": "pytorch_model-00004-of-00006.bin",
213
- "h.21.mlp.c_fc.weight": "pytorch_model-00004-of-00006.bin",
214
- "h.21.mlp.c_proj.bias": "pytorch_model-00004-of-00006.bin",
215
- "h.21.mlp.c_proj.weight": "pytorch_model-00004-of-00006.bin",
216
- "h.22.attn.bias": "pytorch_model-00004-of-00006.bin",
217
- "h.22.attn.c_attn.bias": "pytorch_model-00004-of-00006.bin",
218
- "h.22.attn.c_attn.weight": "pytorch_model-00004-of-00006.bin",
219
- "h.22.attn.c_proj.bias": "pytorch_model-00004-of-00006.bin",
220
- "h.22.attn.c_proj.weight": "pytorch_model-00004-of-00006.bin",
221
- "h.22.attn.masked_bias": "pytorch_model-00004-of-00006.bin",
222
- "h.22.ln_1.bias": "pytorch_model-00004-of-00006.bin",
223
- "h.22.ln_1.weight": "pytorch_model-00004-of-00006.bin",
224
- "h.22.ln_2.bias": "pytorch_model-00004-of-00006.bin",
225
- "h.22.ln_2.weight": "pytorch_model-00004-of-00006.bin",
226
- "h.22.mlp.c_fc.bias": "pytorch_model-00004-of-00006.bin",
227
- "h.22.mlp.c_fc.weight": "pytorch_model-00004-of-00006.bin",
228
- "h.22.mlp.c_proj.bias": "pytorch_model-00004-of-00006.bin",
229
- "h.22.mlp.c_proj.weight": "pytorch_model-00004-of-00006.bin",
230
- "h.23.attn.bias": "pytorch_model-00004-of-00006.bin",
231
- "h.23.attn.c_attn.bias": "pytorch_model-00004-of-00006.bin",
232
- "h.23.attn.c_attn.weight": "pytorch_model-00004-of-00006.bin",
233
- "h.23.attn.c_proj.bias": "pytorch_model-00004-of-00006.bin",
234
- "h.23.attn.c_proj.weight": "pytorch_model-00004-of-00006.bin",
235
- "h.23.attn.masked_bias": "pytorch_model-00004-of-00006.bin",
236
- "h.23.ln_1.bias": "pytorch_model-00004-of-00006.bin",
237
- "h.23.ln_1.weight": "pytorch_model-00004-of-00006.bin",
238
- "h.23.ln_2.bias": "pytorch_model-00004-of-00006.bin",
239
- "h.23.ln_2.weight": "pytorch_model-00004-of-00006.bin",
240
- "h.23.mlp.c_fc.bias": "pytorch_model-00004-of-00006.bin",
241
- "h.23.mlp.c_fc.weight": "pytorch_model-00004-of-00006.bin",
242
- "h.23.mlp.c_proj.bias": "pytorch_model-00004-of-00006.bin",
243
- "h.23.mlp.c_proj.weight": "pytorch_model-00004-of-00006.bin",
244
- "h.24.attn.bias": "pytorch_model-00004-of-00006.bin",
245
- "h.24.attn.c_attn.bias": "pytorch_model-00004-of-00006.bin",
246
- "h.24.attn.c_attn.weight": "pytorch_model-00004-of-00006.bin",
247
- "h.24.attn.c_proj.bias": "pytorch_model-00004-of-00006.bin",
248
- "h.24.attn.c_proj.weight": "pytorch_model-00004-of-00006.bin",
249
- "h.24.attn.masked_bias": "pytorch_model-00004-of-00006.bin",
250
- "h.24.ln_1.bias": "pytorch_model-00004-of-00006.bin",
251
- "h.24.ln_1.weight": "pytorch_model-00004-of-00006.bin",
252
- "h.24.ln_2.bias": "pytorch_model-00004-of-00006.bin",
253
- "h.24.ln_2.weight": "pytorch_model-00004-of-00006.bin",
254
- "h.24.mlp.c_fc.bias": "pytorch_model-00004-of-00006.bin",
255
- "h.24.mlp.c_fc.weight": "pytorch_model-00004-of-00006.bin",
256
- "h.24.mlp.c_proj.bias": "pytorch_model-00004-of-00006.bin",
257
- "h.24.mlp.c_proj.weight": "pytorch_model-00004-of-00006.bin",
258
- "h.25.attn.bias": "pytorch_model-00004-of-00006.bin",
259
- "h.25.attn.c_attn.bias": "pytorch_model-00004-of-00006.bin",
260
- "h.25.attn.c_attn.weight": "pytorch_model-00004-of-00006.bin",
261
- "h.25.attn.c_proj.bias": "pytorch_model-00004-of-00006.bin",
262
- "h.25.attn.c_proj.weight": "pytorch_model-00004-of-00006.bin",
263
- "h.25.attn.masked_bias": "pytorch_model-00004-of-00006.bin",
264
- "h.25.ln_1.bias": "pytorch_model-00004-of-00006.bin",
265
- "h.25.ln_1.weight": "pytorch_model-00004-of-00006.bin",
266
- "h.25.ln_2.bias": "pytorch_model-00004-of-00006.bin",
267
- "h.25.ln_2.weight": "pytorch_model-00004-of-00006.bin",
268
- "h.25.mlp.c_fc.bias": "pytorch_model-00004-of-00006.bin",
269
- "h.25.mlp.c_fc.weight": "pytorch_model-00004-of-00006.bin",
270
- "h.25.mlp.c_proj.bias": "pytorch_model-00004-of-00006.bin",
271
- "h.25.mlp.c_proj.weight": "pytorch_model-00004-of-00006.bin",
272
- "h.26.attn.bias": "pytorch_model-00004-of-00006.bin",
273
- "h.26.attn.c_attn.bias": "pytorch_model-00004-of-00006.bin",
274
- "h.26.attn.c_attn.weight": "pytorch_model-00004-of-00006.bin",
275
- "h.26.attn.c_proj.bias": "pytorch_model-00004-of-00006.bin",
276
- "h.26.attn.c_proj.weight": "pytorch_model-00004-of-00006.bin",
277
- "h.26.attn.masked_bias": "pytorch_model-00004-of-00006.bin",
278
- "h.26.ln_1.bias": "pytorch_model-00004-of-00006.bin",
279
- "h.26.ln_1.weight": "pytorch_model-00004-of-00006.bin",
280
- "h.26.ln_2.bias": "pytorch_model-00004-of-00006.bin",
281
- "h.26.ln_2.weight": "pytorch_model-00004-of-00006.bin",
282
- "h.26.mlp.c_fc.bias": "pytorch_model-00004-of-00006.bin",
283
- "h.26.mlp.c_fc.weight": "pytorch_model-00004-of-00006.bin",
284
- "h.26.mlp.c_proj.bias": "pytorch_model-00004-of-00006.bin",
285
- "h.26.mlp.c_proj.weight": "pytorch_model-00004-of-00006.bin",
286
- "h.27.attn.bias": "pytorch_model-00004-of-00006.bin",
287
- "h.27.attn.c_attn.bias": "pytorch_model-00004-of-00006.bin",
288
- "h.27.attn.c_attn.weight": "pytorch_model-00004-of-00006.bin",
289
- "h.27.attn.c_proj.bias": "pytorch_model-00004-of-00006.bin",
290
- "h.27.attn.c_proj.weight": "pytorch_model-00004-of-00006.bin",
291
- "h.27.attn.masked_bias": "pytorch_model-00004-of-00006.bin",
292
- "h.27.ln_1.bias": "pytorch_model-00004-of-00006.bin",
293
- "h.27.ln_1.weight": "pytorch_model-00004-of-00006.bin",
294
- "h.27.ln_2.bias": "pytorch_model-00004-of-00006.bin",
295
- "h.27.ln_2.weight": "pytorch_model-00004-of-00006.bin",
296
- "h.27.mlp.c_fc.bias": "pytorch_model-00004-of-00006.bin",
297
- "h.27.mlp.c_fc.weight": "pytorch_model-00004-of-00006.bin",
298
- "h.27.mlp.c_proj.bias": "pytorch_model-00004-of-00006.bin",
299
- "h.27.mlp.c_proj.weight": "pytorch_model-00004-of-00006.bin",
300
- "h.28.attn.bias": "pytorch_model-00004-of-00006.bin",
301
- "h.28.attn.c_attn.bias": "pytorch_model-00004-of-00006.bin",
302
- "h.28.attn.c_attn.weight": "pytorch_model-00004-of-00006.bin",
303
- "h.28.attn.c_proj.bias": "pytorch_model-00004-of-00006.bin",
304
- "h.28.attn.c_proj.weight": "pytorch_model-00004-of-00006.bin",
305
- "h.28.attn.masked_bias": "pytorch_model-00004-of-00006.bin",
306
- "h.28.ln_1.bias": "pytorch_model-00004-of-00006.bin",
307
- "h.28.ln_1.weight": "pytorch_model-00004-of-00006.bin",
308
- "h.28.ln_2.bias": "pytorch_model-00004-of-00006.bin",
309
- "h.28.ln_2.weight": "pytorch_model-00004-of-00006.bin",
310
- "h.28.mlp.c_fc.bias": "pytorch_model-00005-of-00006.bin",
311
- "h.28.mlp.c_fc.weight": "pytorch_model-00005-of-00006.bin",
312
- "h.28.mlp.c_proj.bias": "pytorch_model-00005-of-00006.bin",
313
- "h.28.mlp.c_proj.weight": "pytorch_model-00005-of-00006.bin",
314
- "h.29.attn.bias": "pytorch_model-00005-of-00006.bin",
315
- "h.29.attn.c_attn.bias": "pytorch_model-00005-of-00006.bin",
316
- "h.29.attn.c_attn.weight": "pytorch_model-00005-of-00006.bin",
317
- "h.29.attn.c_proj.bias": "pytorch_model-00005-of-00006.bin",
318
- "h.29.attn.c_proj.weight": "pytorch_model-00005-of-00006.bin",
319
- "h.29.attn.masked_bias": "pytorch_model-00005-of-00006.bin",
320
- "h.29.ln_1.bias": "pytorch_model-00005-of-00006.bin",
321
- "h.29.ln_1.weight": "pytorch_model-00005-of-00006.bin",
322
- "h.29.ln_2.bias": "pytorch_model-00005-of-00006.bin",
323
- "h.29.ln_2.weight": "pytorch_model-00005-of-00006.bin",
324
- "h.29.mlp.c_fc.bias": "pytorch_model-00005-of-00006.bin",
325
- "h.29.mlp.c_fc.weight": "pytorch_model-00005-of-00006.bin",
326
- "h.29.mlp.c_proj.bias": "pytorch_model-00005-of-00006.bin",
327
- "h.29.mlp.c_proj.weight": "pytorch_model-00005-of-00006.bin",
328
- "h.3.attn.bias": "pytorch_model-00001-of-00006.bin",
329
- "h.3.attn.c_attn.bias": "pytorch_model-00001-of-00006.bin",
330
- "h.3.attn.c_attn.weight": "pytorch_model-00001-of-00006.bin",
331
- "h.3.attn.c_proj.bias": "pytorch_model-00001-of-00006.bin",
332
- "h.3.attn.c_proj.weight": "pytorch_model-00001-of-00006.bin",
333
- "h.3.attn.masked_bias": "pytorch_model-00001-of-00006.bin",
334
- "h.3.ln_1.bias": "pytorch_model-00001-of-00006.bin",
335
- "h.3.ln_1.weight": "pytorch_model-00001-of-00006.bin",
336
- "h.3.ln_2.bias": "pytorch_model-00001-of-00006.bin",
337
- "h.3.ln_2.weight": "pytorch_model-00001-of-00006.bin",
338
- "h.3.mlp.c_fc.bias": "pytorch_model-00001-of-00006.bin",
339
- "h.3.mlp.c_fc.weight": "pytorch_model-00001-of-00006.bin",
340
- "h.3.mlp.c_proj.bias": "pytorch_model-00001-of-00006.bin",
341
- "h.3.mlp.c_proj.weight": "pytorch_model-00001-of-00006.bin",
342
- "h.30.attn.bias": "pytorch_model-00005-of-00006.bin",
343
- "h.30.attn.c_attn.bias": "pytorch_model-00005-of-00006.bin",
344
- "h.30.attn.c_attn.weight": "pytorch_model-00005-of-00006.bin",
345
- "h.30.attn.c_proj.bias": "pytorch_model-00005-of-00006.bin",
346
- "h.30.attn.c_proj.weight": "pytorch_model-00005-of-00006.bin",
347
- "h.30.attn.masked_bias": "pytorch_model-00005-of-00006.bin",
348
- "h.30.ln_1.bias": "pytorch_model-00005-of-00006.bin",
349
- "h.30.ln_1.weight": "pytorch_model-00005-of-00006.bin",
350
- "h.30.ln_2.bias": "pytorch_model-00005-of-00006.bin",
351
- "h.30.ln_2.weight": "pytorch_model-00005-of-00006.bin",
352
- "h.30.mlp.c_fc.bias": "pytorch_model-00005-of-00006.bin",
353
- "h.30.mlp.c_fc.weight": "pytorch_model-00005-of-00006.bin",
354
- "h.30.mlp.c_proj.bias": "pytorch_model-00005-of-00006.bin",
355
- "h.30.mlp.c_proj.weight": "pytorch_model-00005-of-00006.bin",
356
- "h.31.attn.bias": "pytorch_model-00005-of-00006.bin",
357
- "h.31.attn.c_attn.bias": "pytorch_model-00005-of-00006.bin",
358
- "h.31.attn.c_attn.weight": "pytorch_model-00005-of-00006.bin",
359
- "h.31.attn.c_proj.bias": "pytorch_model-00005-of-00006.bin",
360
- "h.31.attn.c_proj.weight": "pytorch_model-00005-of-00006.bin",
361
- "h.31.attn.masked_bias": "pytorch_model-00005-of-00006.bin",
362
- "h.31.ln_1.bias": "pytorch_model-00005-of-00006.bin",
363
- "h.31.ln_1.weight": "pytorch_model-00005-of-00006.bin",
364
- "h.31.ln_2.bias": "pytorch_model-00005-of-00006.bin",
365
- "h.31.ln_2.weight": "pytorch_model-00005-of-00006.bin",
366
- "h.31.mlp.c_fc.bias": "pytorch_model-00005-of-00006.bin",
367
- "h.31.mlp.c_fc.weight": "pytorch_model-00005-of-00006.bin",
368
- "h.31.mlp.c_proj.bias": "pytorch_model-00005-of-00006.bin",
369
- "h.31.mlp.c_proj.weight": "pytorch_model-00005-of-00006.bin",
370
- "h.32.attn.bias": "pytorch_model-00005-of-00006.bin",
371
- "h.32.attn.c_attn.bias": "pytorch_model-00005-of-00006.bin",
372
- "h.32.attn.c_attn.weight": "pytorch_model-00005-of-00006.bin",
373
- "h.32.attn.c_proj.bias": "pytorch_model-00005-of-00006.bin",
374
- "h.32.attn.c_proj.weight": "pytorch_model-00005-of-00006.bin",
375
- "h.32.attn.masked_bias": "pytorch_model-00005-of-00006.bin",
376
- "h.32.ln_1.bias": "pytorch_model-00005-of-00006.bin",
377
- "h.32.ln_1.weight": "pytorch_model-00005-of-00006.bin",
378
- "h.32.ln_2.bias": "pytorch_model-00005-of-00006.bin",
379
- "h.32.ln_2.weight": "pytorch_model-00005-of-00006.bin",
380
- "h.32.mlp.c_fc.bias": "pytorch_model-00005-of-00006.bin",
381
- "h.32.mlp.c_fc.weight": "pytorch_model-00005-of-00006.bin",
382
- "h.32.mlp.c_proj.bias": "pytorch_model-00005-of-00006.bin",
383
- "h.32.mlp.c_proj.weight": "pytorch_model-00005-of-00006.bin",
384
- "h.33.attn.bias": "pytorch_model-00005-of-00006.bin",
385
- "h.33.attn.c_attn.bias": "pytorch_model-00005-of-00006.bin",
386
- "h.33.attn.c_attn.weight": "pytorch_model-00005-of-00006.bin",
387
- "h.33.attn.c_proj.bias": "pytorch_model-00005-of-00006.bin",
388
- "h.33.attn.c_proj.weight": "pytorch_model-00005-of-00006.bin",
389
- "h.33.attn.masked_bias": "pytorch_model-00005-of-00006.bin",
390
- "h.33.ln_1.bias": "pytorch_model-00005-of-00006.bin",
391
- "h.33.ln_1.weight": "pytorch_model-00005-of-00006.bin",
392
- "h.33.ln_2.bias": "pytorch_model-00005-of-00006.bin",
393
- "h.33.ln_2.weight": "pytorch_model-00005-of-00006.bin",
394
- "h.33.mlp.c_fc.bias": "pytorch_model-00005-of-00006.bin",
395
- "h.33.mlp.c_fc.weight": "pytorch_model-00005-of-00006.bin",
396
- "h.33.mlp.c_proj.bias": "pytorch_model-00005-of-00006.bin",
397
- "h.33.mlp.c_proj.weight": "pytorch_model-00005-of-00006.bin",
398
- "h.34.attn.bias": "pytorch_model-00005-of-00006.bin",
399
- "h.34.attn.c_attn.bias": "pytorch_model-00005-of-00006.bin",
400
- "h.34.attn.c_attn.weight": "pytorch_model-00005-of-00006.bin",
401
- "h.34.attn.c_proj.bias": "pytorch_model-00005-of-00006.bin",
402
- "h.34.attn.c_proj.weight": "pytorch_model-00005-of-00006.bin",
403
- "h.34.attn.masked_bias": "pytorch_model-00005-of-00006.bin",
404
- "h.34.ln_1.bias": "pytorch_model-00005-of-00006.bin",
405
- "h.34.ln_1.weight": "pytorch_model-00005-of-00006.bin",
406
- "h.34.ln_2.bias": "pytorch_model-00005-of-00006.bin",
407
- "h.34.ln_2.weight": "pytorch_model-00005-of-00006.bin",
408
- "h.34.mlp.c_fc.bias": "pytorch_model-00005-of-00006.bin",
409
- "h.34.mlp.c_fc.weight": "pytorch_model-00005-of-00006.bin",
410
- "h.34.mlp.c_proj.bias": "pytorch_model-00005-of-00006.bin",
411
- "h.34.mlp.c_proj.weight": "pytorch_model-00005-of-00006.bin",
412
- "h.35.attn.bias": "pytorch_model-00005-of-00006.bin",
413
- "h.35.attn.c_attn.bias": "pytorch_model-00005-of-00006.bin",
414
- "h.35.attn.c_attn.weight": "pytorch_model-00005-of-00006.bin",
415
- "h.35.attn.c_proj.bias": "pytorch_model-00005-of-00006.bin",
416
- "h.35.attn.c_proj.weight": "pytorch_model-00005-of-00006.bin",
417
- "h.35.attn.masked_bias": "pytorch_model-00005-of-00006.bin",
418
- "h.35.ln_1.bias": "pytorch_model-00005-of-00006.bin",
419
- "h.35.ln_1.weight": "pytorch_model-00005-of-00006.bin",
420
- "h.35.ln_2.bias": "pytorch_model-00005-of-00006.bin",
421
- "h.35.ln_2.weight": "pytorch_model-00005-of-00006.bin",
422
- "h.35.mlp.c_fc.bias": "pytorch_model-00005-of-00006.bin",
423
- "h.35.mlp.c_fc.weight": "pytorch_model-00005-of-00006.bin",
424
- "h.35.mlp.c_proj.bias": "pytorch_model-00006-of-00006.bin",
425
- "h.35.mlp.c_proj.weight": "pytorch_model-00006-of-00006.bin",
426
- "h.36.attn.bias": "pytorch_model-00006-of-00006.bin",
427
- "h.36.attn.c_attn.bias": "pytorch_model-00006-of-00006.bin",
428
- "h.36.attn.c_attn.weight": "pytorch_model-00006-of-00006.bin",
429
- "h.36.attn.c_proj.bias": "pytorch_model-00006-of-00006.bin",
430
- "h.36.attn.c_proj.weight": "pytorch_model-00006-of-00006.bin",
431
- "h.36.attn.masked_bias": "pytorch_model-00006-of-00006.bin",
432
- "h.36.ln_1.bias": "pytorch_model-00006-of-00006.bin",
433
- "h.36.ln_1.weight": "pytorch_model-00006-of-00006.bin",
434
- "h.36.ln_2.bias": "pytorch_model-00006-of-00006.bin",
435
- "h.36.ln_2.weight": "pytorch_model-00006-of-00006.bin",
436
- "h.36.mlp.c_fc.bias": "pytorch_model-00006-of-00006.bin",
437
- "h.36.mlp.c_fc.weight": "pytorch_model-00006-of-00006.bin",
438
- "h.36.mlp.c_proj.bias": "pytorch_model-00006-of-00006.bin",
439
- "h.36.mlp.c_proj.weight": "pytorch_model-00006-of-00006.bin",
440
- "h.37.attn.bias": "pytorch_model-00006-of-00006.bin",
441
- "h.37.attn.c_attn.bias": "pytorch_model-00006-of-00006.bin",
442
- "h.37.attn.c_attn.weight": "pytorch_model-00006-of-00006.bin",
443
- "h.37.attn.c_proj.bias": "pytorch_model-00006-of-00006.bin",
444
- "h.37.attn.c_proj.weight": "pytorch_model-00006-of-00006.bin",
445
- "h.37.attn.masked_bias": "pytorch_model-00006-of-00006.bin",
446
- "h.37.ln_1.bias": "pytorch_model-00006-of-00006.bin",
447
- "h.37.ln_1.weight": "pytorch_model-00006-of-00006.bin",
448
- "h.37.ln_2.bias": "pytorch_model-00006-of-00006.bin",
449
- "h.37.ln_2.weight": "pytorch_model-00006-of-00006.bin",
450
- "h.37.mlp.c_fc.bias": "pytorch_model-00006-of-00006.bin",
451
- "h.37.mlp.c_fc.weight": "pytorch_model-00006-of-00006.bin",
452
- "h.37.mlp.c_proj.bias": "pytorch_model-00006-of-00006.bin",
453
- "h.37.mlp.c_proj.weight": "pytorch_model-00006-of-00006.bin",
454
- "h.38.attn.bias": "pytorch_model-00006-of-00006.bin",
455
- "h.38.attn.c_attn.bias": "pytorch_model-00006-of-00006.bin",
456
- "h.38.attn.c_attn.weight": "pytorch_model-00006-of-00006.bin",
457
- "h.38.attn.c_proj.bias": "pytorch_model-00006-of-00006.bin",
458
- "h.38.attn.c_proj.weight": "pytorch_model-00006-of-00006.bin",
459
- "h.38.attn.masked_bias": "pytorch_model-00006-of-00006.bin",
460
- "h.38.ln_1.bias": "pytorch_model-00006-of-00006.bin",
461
- "h.38.ln_1.weight": "pytorch_model-00006-of-00006.bin",
462
- "h.38.ln_2.bias": "pytorch_model-00006-of-00006.bin",
463
- "h.38.ln_2.weight": "pytorch_model-00006-of-00006.bin",
464
- "h.38.mlp.c_fc.bias": "pytorch_model-00006-of-00006.bin",
465
- "h.38.mlp.c_fc.weight": "pytorch_model-00006-of-00006.bin",
466
- "h.38.mlp.c_proj.bias": "pytorch_model-00006-of-00006.bin",
467
- "h.38.mlp.c_proj.weight": "pytorch_model-00006-of-00006.bin",
468
- "h.39.attn.bias": "pytorch_model-00006-of-00006.bin",
469
- "h.39.attn.c_attn.bias": "pytorch_model-00006-of-00006.bin",
470
- "h.39.attn.c_attn.weight": "pytorch_model-00006-of-00006.bin",
471
- "h.39.attn.c_proj.bias": "pytorch_model-00006-of-00006.bin",
472
- "h.39.attn.c_proj.weight": "pytorch_model-00006-of-00006.bin",
473
- "h.39.attn.masked_bias": "pytorch_model-00006-of-00006.bin",
474
- "h.39.ln_1.bias": "pytorch_model-00006-of-00006.bin",
475
- "h.39.ln_1.weight": "pytorch_model-00006-of-00006.bin",
476
- "h.39.ln_2.bias": "pytorch_model-00006-of-00006.bin",
477
- "h.39.ln_2.weight": "pytorch_model-00006-of-00006.bin",
478
- "h.39.mlp.c_fc.bias": "pytorch_model-00006-of-00006.bin",
479
- "h.39.mlp.c_fc.weight": "pytorch_model-00006-of-00006.bin",
480
- "h.39.mlp.c_proj.bias": "pytorch_model-00006-of-00006.bin",
481
- "h.39.mlp.c_proj.weight": "pytorch_model-00006-of-00006.bin",
482
- "h.4.attn.bias": "pytorch_model-00001-of-00006.bin",
483
- "h.4.attn.c_attn.bias": "pytorch_model-00001-of-00006.bin",
484
- "h.4.attn.c_attn.weight": "pytorch_model-00001-of-00006.bin",
485
- "h.4.attn.c_proj.bias": "pytorch_model-00001-of-00006.bin",
486
- "h.4.attn.c_proj.weight": "pytorch_model-00001-of-00006.bin",
487
- "h.4.attn.masked_bias": "pytorch_model-00001-of-00006.bin",
488
- "h.4.ln_1.bias": "pytorch_model-00001-of-00006.bin",
489
- "h.4.ln_1.weight": "pytorch_model-00001-of-00006.bin",
490
- "h.4.ln_2.bias": "pytorch_model-00001-of-00006.bin",
491
- "h.4.ln_2.weight": "pytorch_model-00001-of-00006.bin",
492
- "h.4.mlp.c_fc.bias": "pytorch_model-00001-of-00006.bin",
493
- "h.4.mlp.c_fc.weight": "pytorch_model-00001-of-00006.bin",
494
- "h.4.mlp.c_proj.bias": "pytorch_model-00001-of-00006.bin",
495
- "h.4.mlp.c_proj.weight": "pytorch_model-00001-of-00006.bin",
496
- "h.5.attn.bias": "pytorch_model-00001-of-00006.bin",
497
- "h.5.attn.c_attn.bias": "pytorch_model-00001-of-00006.bin",
498
- "h.5.attn.c_attn.weight": "pytorch_model-00001-of-00006.bin",
499
- "h.5.attn.c_proj.bias": "pytorch_model-00001-of-00006.bin",
500
- "h.5.attn.c_proj.weight": "pytorch_model-00001-of-00006.bin",
501
- "h.5.attn.masked_bias": "pytorch_model-00001-of-00006.bin",
502
- "h.5.ln_1.bias": "pytorch_model-00001-of-00006.bin",
503
- "h.5.ln_1.weight": "pytorch_model-00001-of-00006.bin",
504
- "h.5.ln_2.bias": "pytorch_model-00001-of-00006.bin",
505
- "h.5.ln_2.weight": "pytorch_model-00001-of-00006.bin",
506
- "h.5.mlp.c_fc.bias": "pytorch_model-00001-of-00006.bin",
507
- "h.5.mlp.c_fc.weight": "pytorch_model-00001-of-00006.bin",
508
- "h.5.mlp.c_proj.bias": "pytorch_model-00001-of-00006.bin",
509
- "h.5.mlp.c_proj.weight": "pytorch_model-00001-of-00006.bin",
510
- "h.6.attn.bias": "pytorch_model-00001-of-00006.bin",
511
- "h.6.attn.c_attn.bias": "pytorch_model-00001-of-00006.bin",
512
- "h.6.attn.c_attn.weight": "pytorch_model-00001-of-00006.bin",
513
- "h.6.attn.c_proj.bias": "pytorch_model-00001-of-00006.bin",
514
- "h.6.attn.c_proj.weight": "pytorch_model-00001-of-00006.bin",
515
- "h.6.attn.masked_bias": "pytorch_model-00001-of-00006.bin",
516
- "h.6.ln_1.bias": "pytorch_model-00001-of-00006.bin",
517
- "h.6.ln_1.weight": "pytorch_model-00001-of-00006.bin",
518
- "h.6.ln_2.bias": "pytorch_model-00001-of-00006.bin",
519
- "h.6.ln_2.weight": "pytorch_model-00001-of-00006.bin",
520
- "h.6.mlp.c_fc.bias": "pytorch_model-00002-of-00006.bin",
521
- "h.6.mlp.c_fc.weight": "pytorch_model-00002-of-00006.bin",
522
- "h.6.mlp.c_proj.bias": "pytorch_model-00002-of-00006.bin",
523
- "h.6.mlp.c_proj.weight": "pytorch_model-00002-of-00006.bin",
524
- "h.7.attn.bias": "pytorch_model-00002-of-00006.bin",
525
- "h.7.attn.c_attn.bias": "pytorch_model-00002-of-00006.bin",
526
- "h.7.attn.c_attn.weight": "pytorch_model-00002-of-00006.bin",
527
- "h.7.attn.c_proj.bias": "pytorch_model-00002-of-00006.bin",
528
- "h.7.attn.c_proj.weight": "pytorch_model-00002-of-00006.bin",
529
- "h.7.attn.masked_bias": "pytorch_model-00002-of-00006.bin",
530
- "h.7.ln_1.bias": "pytorch_model-00002-of-00006.bin",
531
- "h.7.ln_1.weight": "pytorch_model-00002-of-00006.bin",
532
- "h.7.ln_2.bias": "pytorch_model-00002-of-00006.bin",
533
- "h.7.ln_2.weight": "pytorch_model-00002-of-00006.bin",
534
- "h.7.mlp.c_fc.bias": "pytorch_model-00002-of-00006.bin",
535
- "h.7.mlp.c_fc.weight": "pytorch_model-00002-of-00006.bin",
536
- "h.7.mlp.c_proj.bias": "pytorch_model-00002-of-00006.bin",
537
- "h.7.mlp.c_proj.weight": "pytorch_model-00002-of-00006.bin",
538
- "h.8.attn.bias": "pytorch_model-00002-of-00006.bin",
539
- "h.8.attn.c_attn.bias": "pytorch_model-00002-of-00006.bin",
540
- "h.8.attn.c_attn.weight": "pytorch_model-00002-of-00006.bin",
541
- "h.8.attn.c_proj.bias": "pytorch_model-00002-of-00006.bin",
542
- "h.8.attn.c_proj.weight": "pytorch_model-00002-of-00006.bin",
543
- "h.8.attn.masked_bias": "pytorch_model-00002-of-00006.bin",
544
- "h.8.ln_1.bias": "pytorch_model-00002-of-00006.bin",
545
- "h.8.ln_1.weight": "pytorch_model-00002-of-00006.bin",
546
- "h.8.ln_2.bias": "pytorch_model-00002-of-00006.bin",
547
- "h.8.ln_2.weight": "pytorch_model-00002-of-00006.bin",
548
- "h.8.mlp.c_fc.bias": "pytorch_model-00002-of-00006.bin",
549
- "h.8.mlp.c_fc.weight": "pytorch_model-00002-of-00006.bin",
550
- "h.8.mlp.c_proj.bias": "pytorch_model-00002-of-00006.bin",
551
- "h.8.mlp.c_proj.weight": "pytorch_model-00002-of-00006.bin",
552
- "h.9.attn.bias": "pytorch_model-00002-of-00006.bin",
553
- "h.9.attn.c_attn.bias": "pytorch_model-00002-of-00006.bin",
554
- "h.9.attn.c_attn.weight": "pytorch_model-00002-of-00006.bin",
555
- "h.9.attn.c_proj.bias": "pytorch_model-00002-of-00006.bin",
556
- "h.9.attn.c_proj.weight": "pytorch_model-00002-of-00006.bin",
557
- "h.9.attn.masked_bias": "pytorch_model-00002-of-00006.bin",
558
- "h.9.ln_1.bias": "pytorch_model-00002-of-00006.bin",
559
- "h.9.ln_1.weight": "pytorch_model-00002-of-00006.bin",
560
- "h.9.ln_2.bias": "pytorch_model-00002-of-00006.bin",
561
- "h.9.ln_2.weight": "pytorch_model-00002-of-00006.bin",
562
- "h.9.mlp.c_fc.bias": "pytorch_model-00002-of-00006.bin",
563
- "h.9.mlp.c_fc.weight": "pytorch_model-00002-of-00006.bin",
564
- "h.9.mlp.c_proj.bias": "pytorch_model-00002-of-00006.bin",
565
- "h.9.mlp.c_proj.weight": "pytorch_model-00002-of-00006.bin",
566
- "ln_f.bias": "pytorch_model-00006-of-00006.bin",
567
- "ln_f.weight": "pytorch_model-00006-of-00006.bin",
568
- "wpe.weight": "pytorch_model-00001-of-00006.bin",
569
- "wte.weight": "pytorch_model-00001-of-00006.bin"
570
- }
571
- }
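`transformers` uses this `weight_map` to look up which shard file holds each tensor, so the sharded checkpoint can be loaded one file at a time. The recorded `total_size` of 54,223,790,240 bytes is also consistent with the architecture: ~12.9B float32 parameters account for ~51.5 GB, and the per-layer `h.*.attn.bias` causal-mask buffers (8192 x 8192 entries each, stored as one-byte booleans in the standard transformers GPT-2 implementation; the saved dtype is an assumption on my part, not something the index states) add roughly another 2.7 GB:

```python
# Sanity check: checkpoint size vs. the architecture in config.json.
# Assumes float32 weights and one-byte (bool) causal-mask buffers per layer.
n_layer, n_embd, n_positions, vocab_size = 40, 5120, 8192, 50260

param_bytes = (n_layer * 12 * n_embd ** 2 + (vocab_size + n_positions) * n_embd) * 4
mask_bytes = n_layer * n_positions ** 2        # 40 buffers of 8192 x 8192 bools
print(f"~{(param_bytes + mask_bytes) / 1e9:.1f} GB")  # ~54.2 GB vs. total_size 54,223,790,240 B
```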
special_tokens_map.json DELETED
@@ -1,5 +0,0 @@
- {
-   "bos_token": "<|endoftext|>",
-   "eos_token": "<|endoftext|>",
-   "unk_token": "<|endoftext|>"
- }

tokenizer.json DELETED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json DELETED
@@ -1,10 +0,0 @@
- {
-   "add_prefix_space": false,
-   "bos_token": "<|endoftext|>",
-   "eos_token": "<|endoftext|>",
-   "model_max_length": 1024,
-   "name_or_path": "gpt2",
-   "special_tokens_map_file": null,
-   "tokenizer_class": "GPT2Tokenizer",
-   "unk_token": "<|endoftext|>"
- }
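One detail worth flagging: this `tokenizer_config.json` keeps the base GPT-2 default of `model_max_length: 1024`, while the deleted `config.json` sets `n_ctx` and `n_positions` to 8192. Anyone reconstructing the setup would likely want to override the tokenizer limit to match the model's actual context. A sketch of that override (the 8192 value comes from `config.json`; whether to override is a judgment call, not something the files state):

```python
# Sketch: align the tokenizer's length limit with the model's 8192-token context,
# since tokenizer_config.json carries the GPT-2 default of 1024.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "sambanovasystems/SN-13B-8k-Instruct",
    model_max_length=8192,  # n_ctx / n_positions from config.json
)
```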
vocab.json DELETED
The diff for this file is too large to render. See raw diff