BAAI / nova-d48w1024-osp480

PhyscalX committed b6f77bf (1 parent: 78a5114)

Initial commit
.gitignore ADDED
@@ -0,0 +1,55 @@
+ # Compiled Object files
+ *.slo
+ *.lo
+ *.o
+ *.cuo
+
+ # Compiled Dynamic libraries
+ *.so
+ *.dll
+ *.dylib
+
+ # Compiled Static libraries
+ *.lai
+ *.la
+ *.a
+ *.lib
+
+ # Compiled python
+ *.pyc
+ __pycache__
+
+ # Compiled MATLAB
+ *.mex*
+
+ # IPython notebook checkpoints
+ .ipynb_checkpoints
+
+ # Editor temporaries
+ *.swp
+ *~
+
+ # Sublime Text settings
+ *.sublime-workspace
+ *.sublime-project
+
+ # Eclipse Project settings
+ *.*project
+ .settings
+
+ # QtCreator files
+ *.user
+
+ # VSCode files
+ .vscode
+
+ # IDEA files
+ .idea
+
+ # OSX dir files
+ .DS_Store
+
+ # Android files
+ .gradle
+ *.iml
+ local.properties
LICENSE ADDED
@@ -0,0 +1,176 @@
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
README.md CHANGED
@@ -1,3 +1,90 @@
- ---
- license: apache-2.0
- ---
+ ---
+ license: apache-2.0
+ tags:
+ - text-to-video
+ - video-generation
+ - baai-nova
+ ---
+
+ # NOVA (d48w1024-osp480) Model Card
+
+ ## Model Details
+ - **Developed by:** BAAI
+ - **Model type:** Masked Autoregressive Text-to-Video Generation Model
+ - **Model size:** 645M
+ - **Model precision:** torch.float16 (FP16)
+ - **Model resolution:** 768x480
+ - **Model Description:** This is a model that can be used to generate and modify videos based on text prompts. It is a [Masked Autoregressive (MAR)](https://arxiv.org/abs/2406.11838) diffusion model that uses a pretrained text encoder ([Phi-2](https://huggingface.co/microsoft/phi-2)) and a VAE video tokenizer ([OpenSoraPlanV1.2-VAE](https://huggingface.co/LanguageBind/Open-Sora-Plan-v1.2.0)).
+ - **Model License:** [Apache 2.0 License](LICENSE)
+ - **Resources for more information:** [GitHub Repository](https://github.com/baaivision/NOVA).
+
+ ## Examples
+
+ Use the [🤗 Diffusers library](https://github.com/huggingface/diffusers) to run NOVA in a simple and efficient manner.
+
+ ```bash
+ pip install diffusers transformers accelerate imageio[ffmpeg]
+ pip install git+ssh://git@github.com/baaivision/NOVA.git
+ ```
+
+ Running the pipeline:
+
+ ```python
+ import torch
+ from diffnext.pipelines import NOVAPipeline
+ from diffnext.utils import export_to_image, export_to_video
+
+ model_id = "BAAI/nova-d48w1024-osp480"
+ model_args = {"torch_dtype": torch.float16, "trust_remote_code": True}
+ pipe = NOVAPipeline.from_pretrained(model_id, **model_args)
+ pipe = pipe.to("cuda")
+
+ prompt = "Many spotted jellyfish pulsating under water."
+
+ image = pipe(prompt, max_latent_length=1).frames[0, 0]
+ export_to_image(image, "jellyfish.jpg")
+
+ video = pipe(prompt, max_latent_length=9).frames[0]
+ export_to_video(video, "jellyfish.mp4", fps=12)
+ ```
+
+ # Uses
+
+ ## Direct Use
+ The model is intended for research purposes only. Possible research areas and tasks include:
+
+ - Research on generative models.
+ - Applications in educational or creative tools.
+ - Generation of artworks and use in design and other artistic processes.
+ - Probing and understanding the limitations and biases of generative models.
+ - Safe deployment of models which have the potential to generate harmful content.
+
+ Excluded uses are described below.
+
+ #### Out-of-Scope Use
+ The model was not trained to produce factual or true representations of people or events, so using it to generate such content is out of scope for its abilities.
+
+ #### Misuse and Malicious Use
+ Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
+
+ - Mis- and disinformation.
+ - Representations of egregious violence and gore.
+ - Impersonating individuals without their consent.
+ - Sexual content without consent of the people who might see it.
+ - Sharing of copyrighted or licensed material in violation of its terms of use.
+ - Intentionally promoting or propagating discriminatory content or harmful stereotypes.
+ - Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
+ - Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
+
+ ## Limitations and Bias
+
+ ### Limitations
+
+ - The autoencoding part of the model is lossy.
+ - The model cannot render complex legible text.
+ - The model does not achieve perfect photorealism.
+ - Fine details such as fingers may in general not be generated properly.
+ - The model was trained on a subset of the web datasets [LAION-5B](https://laion.ai/blog/laion-5b/) and [COYO-700M](https://github.com/kakaobrain/coyo-dataset), which contain adult, violent, and sexual content.
+
+ ### Bias
+ While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases.
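The README example above renders one prompt at a time. As a minimal sketch (reusing only the `NOVAPipeline` calls shown in the README example; the extra prompts and output file names are purely illustrative), the same pipeline object can be looped over several prompts:

```python
import torch
from diffnext.pipelines import NOVAPipeline
from diffnext.utils import export_to_video

# Load the pipeline exactly as in the README example.
model_id = "BAAI/nova-d48w1024-osp480"
pipe = NOVAPipeline.from_pretrained(model_id, torch_dtype=torch.float16, trust_remote_code=True)
pipe = pipe.to("cuda")

# Hypothetical prompt list; each prompt reuses the call shown in the README.
prompts = [
    "Many spotted jellyfish pulsating under water.",
    "A corgi running on a sunny beach.",
]
for i, prompt in enumerate(prompts):
    video = pipe(prompt, max_latent_length=9).frames[0]  # 9 latent frames, as in the README
    export_to_video(video, f"sample_{i}.mp4", fps=12)
```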
model_index.json ADDED
@@ -0,0 +1,23 @@
+ {
+ "_class_name": "NOVAPipeline",
+ "tokenizer": [
+ "transformers",
+ "CodeGenTokenizerFast"
+ ],
+ "scheduler": [
+ "diffusers",
+ "DDPMScheduler"
+ ],
+ "vae": [
+ "__vae__",
+ "AutoencoderKLOpenSora"
+ ],
+ "text_encoder": [
+ "__text_encoder__",
+ "PhiEncoderModel"
+ ],
+ "transformer": [
+ "__transformer__",
+ "NOVATransformer3DModel"
+ ]
+ }
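`model_index.json` tells the diffusers-style loader which library (or repo-local module such as `__text_encoder__`) and class back each pipeline component, which appears to be why the README passes `trust_remote_code=True`. A minimal sketch (assuming only `huggingface_hub` is installed) that reads the index and prints the mapping:

```python
import json
from huggingface_hub import hf_hub_download

# Fetch model_index.json and list which library/class implements each component.
path = hf_hub_download("BAAI/nova-d48w1024-osp480", "model_index.json")
with open(path) as f:
    index = json.load(f)

for name, spec in index.items():
    if isinstance(spec, list):  # component entries are [library, class] pairs
        library, cls = spec
        print(f"{name}: {cls} (from {library})")
```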
scheduler/scheduler_config.json ADDED
@@ -0,0 +1,8 @@
+ {
+ "_class_name": "DDPMScheduler",
+ "beta_schedule": "squaredcos_cap_v2",
+ "num_train_timesteps": 1000,
+ "set_alpha_to_one": false,
+ "steps_offset": 1,
+ "clip_sample": false
+ }
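The scheduler is a stock diffusers `DDPMScheduler` with a squared-cosine beta schedule and 1000 training timesteps. A minimal sketch (assuming a standard diffusers install; `NOVAPipeline.from_pretrained` normally loads this for you) of instantiating it directly from the `scheduler` subfolder:

```python
from diffusers import DDPMScheduler  # standard diffusers scheduler class

# Load the scheduler from this repo's "scheduler" subfolder.
scheduler = DDPMScheduler.from_pretrained(
    "BAAI/nova-d48w1024-osp480", subfolder="scheduler"
)
print(scheduler.config.beta_schedule)        # "squaredcos_cap_v2"
print(scheduler.config.num_train_timesteps)  # 1000
```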
text_encoder/__text_encoder__.py ADDED
@@ -0,0 +1,17 @@
+ # Copyright (c) 2024-present, BAAI. All Rights Reserved.
+ #
+ # Licensed under the Apache License, Version 2.0 (the "License");
+ # you may not use this file except in compliance with the License.
+ # You may obtain a copy of the License at
+ #
+ # http://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS,
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ # See the License for the specific language governing permissions and
+ # limitations under the License.
+ ##############################################################################
+ """Text encoder."""
+
+ from diffnext.models.text_encoders.phi import PhiEncoderModel # noqa
text_encoder/config.json ADDED
@@ -0,0 +1,23 @@
+ {
+ "architectures": [
+ "PhiEncoderModel"
+ ],
+ "bos_token_id": 50256,
+ "eos_token_id": 50256,
+ "pad_token_id": 50256,
+ "hidden_act": "gelu_pytorch_tanh",
+ "hidden_size": 2560,
+ "intermediate_size": 10240,
+ "layer_norm_eps": 1e-05,
+ "max_position_embeddings": 2048,
+ "model_type": "phi",
+ "num_attention_heads": 32,
+ "num_hidden_layers": 32,
+ "num_key_value_heads": 32,
+ "partial_rotary_factor": 0.4,
+ "rope_scaling": null,
+ "rope_theta": 10000.0,
+ "torch_dtype": "float16",
+ "use_cache": true,
+ "vocab_size": 51200
+ }
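This config mirrors Phi-2's architecture (hidden size 2560, 32 layers, 32 heads, vocabulary 51200), matching the text encoder named in the model card. A minimal sketch (assuming a standard transformers install; the parameter count is a rough back-of-the-envelope estimate from the config, not an official figure) of inspecting it from the `text_encoder` subfolder:

```python
from transformers import AutoConfig

# Read the Phi-2-style text-encoder config from the "text_encoder" subfolder.
cfg = AutoConfig.from_pretrained("BAAI/nova-d48w1024-osp480", subfolder="text_encoder")
print(cfg.model_type, cfg.hidden_size, cfg.num_hidden_layers)  # phi 2560 32

# Rough size estimate: token embeddings plus 32 blocks of attention and MLP weights.
d, ff, layers, vocab = cfg.hidden_size, cfg.intermediate_size, cfg.num_hidden_layers, cfg.vocab_size
approx_params = vocab * d + layers * (4 * d * d + 2 * d * ff)
print(f"~{approx_params / 1e9:.1f}B parameters in the encoder backbone")  # roughly 2.6B
```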
text_encoder/model-00001-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7fbcdefa72edf7527bf5da40535b57d9f5bd3d16829b94a9d25d2b457df62e84
+ size 4995584424
text_encoder/model-00002-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:17b98759e4b7953cbcf63ec49be7edbc9b863b57c207d84a52f5d2f5bcfcf6b4
+ size 563832976
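The two `.safetensors` entries above are Git LFS pointer files: the FP16 text-encoder shards (about 5.0 GB and 0.56 GB) live in LFS, and each pointer records only the shard's SHA-256 and byte size. A minimal sketch (standard library only; the local path is hypothetical) for verifying a downloaded shard against its pointer:

```python
import hashlib
import os

def verify_lfs_pointer(local_path: str, expected_oid: str, expected_size: int) -> bool:
    """Check a downloaded file against the sha256/size recorded in its LFS pointer."""
    if os.path.getsize(local_path) != expected_size:
        return False
    h = hashlib.sha256()
    with open(local_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
            h.update(chunk)
    return h.hexdigest() == expected_oid

# Example (hypothetical local path to the first shard):
ok = verify_lfs_pointer(
    "text_encoder/model-00001-of-00002.safetensors",
    "7fbcdefa72edf7527bf5da40535b57d9f5bd3d16829b94a9d25d2b457df62e84",
    4995584424,
)
print("shard 1 matches pointer:", ok)
```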
text_encoder/model.safetensors.index.json ADDED
@@ -0,0 +1,460 @@
+ {
+ "metadata": {
+ "total_size": 5559367680
+ },
+ "weight_map": {
+ "lm_head.bias": "model-00002-of-00002.safetensors",
+ "lm_head.weight": "model-00002-of-00002.safetensors",
+ "model.embed_tokens.weight": "model-00001-of-00002.safetensors",
+ "model.final_layernorm.bias": "model-00002-of-00002.safetensors",
+ "model.final_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.0.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.0.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.0.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.1.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.1.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.1.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.10.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.10.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.10.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.11.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.11.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.11.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.11.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.12.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.12.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.12.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.12.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.12.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.12.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.12.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.12.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.13.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.13.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.13.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.13.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.14.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.14.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.14.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.14.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.15.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.15.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.15.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.15.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.16.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.16.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.16.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.16.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.17.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.17.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.17.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.17.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.18.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.18.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.18.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.18.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.19.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.19.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.19.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.19.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.2.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.2.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.2.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.20.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.20.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.20.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.20.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.21.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.21.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.21.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.21.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.22.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.22.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.22.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.22.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.23.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.23.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.23.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.23.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.24.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.24.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.24.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.24.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.25.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.25.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.25.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.25.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.26.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.26.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.26.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.26.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.26.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.26.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.26.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.26.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.26.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.26.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.26.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.26.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.26.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.26.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.27.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.27.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.27.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.27.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.27.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.27.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.27.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.27.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.28.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.28.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.28.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.28.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.28.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.28.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.28.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.28.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.28.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.28.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.28.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.28.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.28.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.28.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.29.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.29.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.29.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.29.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.29.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.29.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.29.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.29.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.29.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.29.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.29.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.29.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.29.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.29.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.3.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.3.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.3.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.3.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.3.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.30.input_layernorm.bias": "model-00002-of-00002.safetensors",
+ "model.layers.30.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.30.mlp.fc1.bias": "model-00002-of-00002.safetensors",
+ "model.layers.30.mlp.fc1.weight": "model-00002-of-00002.safetensors",
+ "model.layers.30.mlp.fc2.bias": "model-00002-of-00002.safetensors",
+ "model.layers.30.mlp.fc2.weight": "model-00002-of-00002.safetensors",
+ "model.layers.30.self_attn.dense.bias": "model-00002-of-00002.safetensors",
+ "model.layers.30.self_attn.dense.weight": "model-00002-of-00002.safetensors",
+ "model.layers.30.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
+ "model.layers.30.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.30.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.30.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.30.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
+ "model.layers.30.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.31.input_layernorm.bias": "model-00002-of-00002.safetensors",
+ "model.layers.31.input_layernorm.weight": "model-00002-of-00002.safetensors",
+ "model.layers.31.mlp.fc1.bias": "model-00002-of-00002.safetensors",
+ "model.layers.31.mlp.fc1.weight": "model-00002-of-00002.safetensors",
+ "model.layers.31.mlp.fc2.bias": "model-00002-of-00002.safetensors",
+ "model.layers.31.mlp.fc2.weight": "model-00002-of-00002.safetensors",
+ "model.layers.31.self_attn.dense.bias": "model-00002-of-00002.safetensors",
+ "model.layers.31.self_attn.dense.weight": "model-00002-of-00002.safetensors",
+ "model.layers.31.self_attn.k_proj.bias": "model-00002-of-00002.safetensors",
+ "model.layers.31.self_attn.k_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.31.self_attn.q_proj.bias": "model-00002-of-00002.safetensors",
+ "model.layers.31.self_attn.q_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.31.self_attn.v_proj.bias": "model-00002-of-00002.safetensors",
+ "model.layers.31.self_attn.v_proj.weight": "model-00002-of-00002.safetensors",
+ "model.layers.4.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.4.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.4.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.4.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.4.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.4.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.4.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.4.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.4.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.4.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.4.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.5.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.5.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.5.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.5.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.5.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.5.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.5.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.5.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.5.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.5.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.5.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.6.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.6.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.6.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.6.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.6.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.6.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.6.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.6.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.6.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.6.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.6.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.7.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.7.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.7.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.7.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.7.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.7.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.7.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.7.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.7.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.7.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.7.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.8.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.8.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.8.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.8.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.8.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.8.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.8.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.8.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.8.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.8.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.8.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.8.self_attn.v_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.9.input_layernorm.bias": "model-00001-of-00002.safetensors",
+ "model.layers.9.input_layernorm.weight": "model-00001-of-00002.safetensors",
+ "model.layers.9.mlp.fc1.bias": "model-00001-of-00002.safetensors",
+ "model.layers.9.mlp.fc1.weight": "model-00001-of-00002.safetensors",
+ "model.layers.9.mlp.fc2.bias": "model-00001-of-00002.safetensors",
+ "model.layers.9.mlp.fc2.weight": "model-00001-of-00002.safetensors",
+ "model.layers.9.self_attn.dense.bias": "model-00001-of-00002.safetensors",
+ "model.layers.9.self_attn.dense.weight": "model-00001-of-00002.safetensors",
+ "model.layers.9.self_attn.k_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.9.self_attn.q_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
+ "model.layers.9.self_attn.v_proj.bias": "model-00001-of-00002.safetensors",
+ "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00002.safetensors"
+ }
+ }
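The index maps every text-encoder tensor to one of the two shards above (`total_size` of about 5.56 GB in FP16). A minimal sketch (assuming the `huggingface_hub` and `safetensors` packages) that uses `weight_map` to download only the shard holding a given tensor and read it:

```python
import json
from huggingface_hub import hf_hub_download
from safetensors import safe_open

repo = "BAAI/nova-d48w1024-osp480"
index_path = hf_hub_download(repo, "text_encoder/model.safetensors.index.json")
with open(index_path) as f:
    weight_map = json.load(f)["weight_map"]

# Find which shard holds a given tensor, fetch just that shard, and read the tensor.
name = "model.layers.0.mlp.fc1.weight"
shard = weight_map[name]  # e.g. "model-00001-of-00002.safetensors"
shard_path = hf_hub_download(repo, f"text_encoder/{shard}")
with safe_open(shard_path, framework="pt") as f:
    tensor = f.get_tensor(name)
print(name, tuple(tensor.shape))  # expected (10240, 2560) given the encoder config
```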
tokenizer/tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer/tokenizer_config.json ADDED
@@ -0,0 +1,324 @@
+ {
+ "add_prefix_space": false,
+ "added_tokens_decoder": {
+ "50256": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "50257": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50258": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50259": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50260": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50261": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50262": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50263": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50264": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50265": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50266": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50267": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50268": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50269": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50270": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50271": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50272": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50273": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50274": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50275": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50276": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50277": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50278": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50279": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50280": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50281": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50282": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50283": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50284": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50285": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50286": {
+ "content": " ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50287": {
+ "content": "\t\t\t\t\t\t\t\t\t",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50288": {
+ "content": "\t\t\t\t\t\t\t\t",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50289": {
+ "content": "\t\t\t\t\t\t\t",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50290": {
+ "content": "\t\t\t\t\t\t",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50291": {
+ "content": "\t\t\t\t\t",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50292": {
+ "content": "\t\t\t\t",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50293": {
+ "content": "\t\t\t",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "50294": {
+ "content": "\t\t",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ }
+ },
+ "bos_token": "<|endoftext|>",
+ "clean_up_tokenization_spaces": true,
+ "eos_token": "<|endoftext|>",
+ "pad_token": "<|endoftext|>",
+ "model_max_length": 256,
+ "tokenizer_class": "CodeGenTokenizer",
+ "unk_token": "<|endoftext|>"
+ }
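The config above registers <|endoftext|> as BOS/EOS/PAD/UNK, caps inputs at 256 tokens, and keeps the extra whitespace tokens of the CodeGen vocabulary. A minimal loading sketch with transformers; the repository id is a placeholder:

from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("BAAI/<this-repo>", subfolder="tokenizer")
# Pad prompts to the 256-token text length expected by the transformer config below.
ids = tok("a corgi surfing at sunset", padding="max_length", max_length=256).input_ids
print(len(ids), ids[:5])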
transformer/__transformer__.py ADDED
@@ -0,0 +1,17 @@
+ # Copyright (c) 2024-present, BAAI. All Rights Reserved.
+ #
+ # Licensed under the Apache License, Version 2.0 (the "License");
+ # you may not use this file except in compliance with the License.
+ # You may obtain a copy of the License at
+ #
+ # http://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS,
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ # See the License for the specific language governing permissions and
+ # limitations under the License.
+ ##############################################################################
+ """Transformer model."""
+
+ from diffnext.models.transformers.transformer_nova import NOVATransformer3DModel # noqa
transformer/config.json ADDED
@@ -0,0 +1,25 @@
+ {
+ "image_dim": 4,
+ "image_size": [
+ 480,
+ 768
+ ],
+ "image_stride": 8,
+ "text_token_dim": 2560,
+ "text_token_len": 256,
+ "video_mixer_rank": 24,
+ "video_base_size": [
+ 16,
+ 15,
+ 24
+ ],
+ "image_base_size": [
+ 30,
+ 48
+ ],
+ "arch": [
+ "vit_d16w1024",
+ "vit_d32w1024",
+ "mlp_d3w1280"
+ ]
+ }
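A quick, hedged reading of these numbers: 480x768 frames divided by image_stride 8 give a 60x96 latent grid, and image_base_size 30x48 is exactly half of that, which would correspond to a 2x2 patch embedding (an inference from the arithmetic, not something the file states); video_base_size adds a 16-step axis over a 15x24 spatial grid. A short sketch of that bookkeeping:

# Spatial bookkeeping implied by transformer/config.json (inferred, not authoritative).
image_size = (480, 768)
image_stride = 8                                              # VAE downsampling factor
latent_grid = tuple(s // image_stride for s in image_size)    # (60, 96)
image_base_size = (30, 48)
patch = tuple(g // b for g, b in zip(latent_grid, image_base_size))  # (2, 2) -> 2x2 patches
tokens_per_image = image_base_size[0] * image_base_size[1]    # 1440 image tokens
print(latent_grid, patch, tokens_per_image)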
transformer/diffusion_pytorch_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:21dec394e82e040a4173f8de6369ff8cc84cedb3c041ccf0e85d36c4ed99958d
+ size 1291088736
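This entry is a Git LFS pointer, not the weights themselves: it records only the sha256 and byte size (about 1.29 GB) of the safetensors blob, which `git lfs pull` or the hub client downloads. A small sketch that reads such a pointer when the repo was cloned without LFS smudging:

# Parse the three "key value" lines of a Git LFS pointer file.
def parse_lfs_pointer(path):
    fields = {}
    with open(path) as f:
        for line in f:
            key, _, value = line.strip().partition(" ")
            fields[key] = value
    return fields

ptr = parse_lfs_pointer("transformer/diffusion_pytorch_model.safetensors")
print(ptr["oid"], round(int(ptr["size"]) / 1e9, 2), "GB")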
vae/__vae__.py ADDED
@@ -0,0 +1,17 @@
+ # Copyright (c) 2024-present, BAAI. All Rights Reserved.
+ #
+ # Licensed under the Apache License, Version 2.0 (the "License");
+ # you may not use this file except in compliance with the License.
+ # You may obtain a copy of the License at
+ #
+ # http://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS,
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ # See the License for the specific language governing permissions and
+ # limitations under the License.
+ ##############################################################################
+ """VAE model."""
+
+ from diffnext.models.autoencoders.autoencoder_kl_opensora import AutoencoderKLOpenSora # noqa
vae/config.json ADDED
@@ -0,0 +1,29 @@
+ {
+ "_class_name": "AutoencoderKLOpenSora",
+ "act_fn": "silu",
+ "block_out_channels": [
+ 128,
+ 256,
+ 512,
+ 512
+ ],
+ "down_block_types": [
+ "DownEncoderBlock2D",
+ "DownEncoderBlock2D",
+ "DownEncoderBlock3D",
+ "DownEncoderBlock3D"
+ ],
+ "in_channels": 3,
+ "latent_channels": 4,
+ "layers_per_block": 2,
+ "norm_num_groups": 32,
+ "out_channels": 3,
+ "sample_size": 480,
+ "scaling_factor": 0.18215,
+ "up_block_types": [
+ "UpDecoderBlock3D",
+ "UpDecoderBlock3D",
+ "UpDecoderBlock3D",
+ "UpDecoderBlock3D"
+ ]
+ }
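The four encoder blocks imply three spatial downsamplings, i.e. a factor of 8 that matches image_stride above; the two 3D blocks additionally compress time, and latent_channels 4 lines up with the transformer's image_dim. A rough shape sketch; the temporal factor of 4 is an assumption about OpenSora-style causal VAEs, not stated in this file:

# Rough latent-shape bookkeeping for vae/config.json (a sketch, not the model code).
block_out_channels = [128, 256, 512, 512]
spatial_factor = 2 ** (len(block_out_channels) - 1)   # 8, matching image_stride
temporal_factor = 4                                    # assumed causal-VAE time compression
frames, height, width = 33, 480, 768                   # example clip
latent_shape = (4,                                     # latent_channels
                1 + (frames - 1) // temporal_factor,   # 9 latent frames
                height // spatial_factor,              # 60
                width // spatial_factor)               # 96
print(latent_shape)
# By the usual diffusers convention, encoded latents are multiplied by
# scaling_factor (0.18215) and divided by it again before decoding.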
vae/diffusion_pytorch_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1e860e79354462b9f3df3b2afd1a35ef258b986c9f01f1497f9acfec4c0da046
+ size 478407926
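Taken together, the commit lays out a diffusers-style component layout: a tokenizer/, the sharded language-model weights indexed above, a transformer/ and a vae/, plus __transformer__.py and __vae__.py shims that re-export the diffnext classes. A highly schematic sketch of pulling the pieces together by hand; the from_pretrained behaviour of the diffnext classes and the repository id are assumptions, and diffnext presumably ships a pipeline that does this for you:

from transformers import AutoTokenizer
from diffnext.models.transformers.transformer_nova import NOVATransformer3DModel
from diffnext.models.autoencoders.autoencoder_kl_opensora import AutoencoderKLOpenSora

repo = "BAAI/<this-repo>"  # placeholder repository id
tokenizer = AutoTokenizer.from_pretrained(repo, subfolder="tokenizer")
# Assumed: both classes follow the diffusers ModelMixin loading convention.
transformer = NOVATransformer3DModel.from_pretrained(repo, subfolder="transformer")
vae = AutoencoderKLOpenSora.from_pretrained(repo, subfolder="vae")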