TheBloke committed
Commit 288acf3
1 Parent(s): 5b9d2ec

AWQ model commit

LICENSE.txt ADDED
@@ -0,0 +1,126 @@
+ LLAMA 2 COMMUNITY LICENSE AGREEMENT
+ Llama 2 Version Release Date: July 18, 2023
+
+ "Agreement" means the terms and conditions for use, reproduction, distribution and
+ modification of the Llama Materials set forth herein.
+
+ "Documentation" means the specifications, manuals and documentation
+ accompanying Llama 2 distributed by Meta at ai.meta.com/resources/models-and-
+ libraries/llama-downloads/.
+
+ "Licensee" or "you" means you, or your employer or any other person or entity (if
+ you are entering into this Agreement on such person or entity's behalf), of the age
+ required under applicable laws, rules or regulations to provide legal consent and that
+ has legal authority to bind your employer or such other person or entity if you are
+ entering in this Agreement on their behalf.
+
+ "Llama 2" means the foundational large language models and software and
+ algorithms, including machine-learning model code, trained model weights,
+ inference-enabling code, training-enabling code, fine-tuning enabling code and other
+ elements of the foregoing distributed by Meta at ai.meta.com/resources/models-and-
+ libraries/llama-downloads/.
+
+ "Llama Materials" means, collectively, Meta's proprietary Llama 2 and
+ Documentation (and any portion thereof) made available under this Agreement.
+
+ "Meta" or "we" means Meta Platforms Ireland Limited (if you are located in or, if you
+ are an entity, your principal place of business is in the EEA or Switzerland) and Meta
+ Platforms, Inc. (if you are located outside of the EEA or Switzerland).
+
+ By clicking "I Accept" below or by using or distributing any portion or element of the
+ Llama Materials, you agree to be bound by this Agreement.
+
+ 1. License Rights and Redistribution.
+
+ a. Grant of Rights. You are granted a non-exclusive, worldwide, non-
+ transferable and royalty-free limited license under Meta's intellectual property or
+ other rights owned by Meta embodied in the Llama Materials to use, reproduce,
+ distribute, copy, create derivative works of, and make modifications to the Llama
+ Materials.
+
+ b. Redistribution and Use.
+
+ i. If you distribute or make the Llama Materials, or any derivative works
+ thereof, available to a third party, you shall provide a copy of this Agreement to such
+ third party.
+ ii. If you receive Llama Materials, or any derivative works thereof, from
+ a Licensee as part of an integrated end user product, then Section 2 of this
+ Agreement will not apply to you.
+
+ iii. You must retain in all copies of the Llama Materials that you
+ distribute the following attribution notice within a "Notice" text file distributed as a
+ part of such copies: "Llama 2 is licensed under the LLAMA 2 Community License,
+ Copyright (c) Meta Platforms, Inc. All Rights Reserved."
+
+ iv. Your use of the Llama Materials must comply with applicable laws
+ and regulations (including trade compliance laws and regulations) and adhere to the
+ Acceptable Use Policy for the Llama Materials (available at
+ https://ai.meta.com/llama/use-policy), which is hereby incorporated by reference into
+ this Agreement.
+
+ v. You will not use the Llama Materials or any output or results of the
+ Llama Materials to improve any other large language model (excluding Llama 2 or
+ derivative works thereof).
+
+ 2. Additional Commercial Terms. If, on the Llama 2 version release date, the
+ monthly active users of the products or services made available by or for Licensee,
+ or Licensee's affiliates, is greater than 700 million monthly active users in the
+ preceding calendar month, you must request a license from Meta, which Meta may
+ grant to you in its sole discretion, and you are not authorized to exercise any of the
+ rights under this Agreement unless or until Meta otherwise expressly grants you
+ such rights.
+
+ 3. Disclaimer of Warranty. UNLESS REQUIRED BY APPLICABLE LAW, THE
+ LLAMA MATERIALS AND ANY OUTPUT AND RESULTS THEREFROM ARE
+ PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
+ EITHER EXPRESS OR IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY
+ WARRANTIES OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, OR
+ FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE
+ FOR DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING
+ THE LLAMA MATERIALS AND ASSUME ANY RISKS ASSOCIATED WITH YOUR
+ USE OF THE LLAMA MATERIALS AND ANY OUTPUT AND RESULTS.
+
+ 4. Limitation of Liability. IN NO EVENT WILL META OR ITS AFFILIATES BE
+ LIABLE UNDER ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, TORT,
+ NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS
+ AGREEMENT, FOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL,
+ CONSEQUENTIAL, INCIDENTAL, EXEMPLARY OR PUNITIVE DAMAGES, EVEN
+ IF META OR ITS AFFILIATES HAVE BEEN ADVISED OF THE POSSIBILITY OF
+ ANY OF THE FOREGOING.
+
+ 5. Intellectual Property.
+
+ a. No trademark licenses are granted under this Agreement, and in
+ connection with the Llama Materials, neither Meta nor Licensee may use any name
+ or mark owned by or associated with the other or any of its affiliates, except as
+ required for reasonable and customary use in describing and redistributing the
+ Llama Materials.
+
+ b. Subject to Meta's ownership of Llama Materials and derivatives made by or
+ for Meta, with respect to any derivative works and modifications of the Llama
+ Materials that are made by you, as between you and Meta, you are and will be the
+ owner of such derivative works and modifications.
+
+ c. If you institute litigation or other proceedings against Meta or any entity
+ (including a cross-claim or counterclaim in a lawsuit) alleging that the Llama
+ Materials or Llama 2 outputs or results, or any portion of any of the foregoing,
+ constitutes infringement of intellectual property or other rights owned or licensable
+ by you, then any licenses granted to you under this Agreement shall terminate as of
+ the date such litigation or claim is filed or instituted. You will indemnify and hold
+ harmless Meta from and against any claim by any third party arising out of or related
+ to your use or distribution of the Llama Materials.
+
+ 6. Term and Termination. The term of this Agreement will commence upon your
+ acceptance of this Agreement or access to the Llama Materials and will continue in
+ full force and effect until terminated in accordance with the terms and conditions
+ herein. Meta may terminate this Agreement if you are in breach of any term or
+ condition of this Agreement. Upon termination of this Agreement, you shall delete
+ and cease use of the Llama Materials. Sections 3, 4 and 7 shall survive the
+ termination of this Agreement.
+
+ 7. Governing Law and Jurisdiction. This Agreement will be governed and
+ construed under the laws of the State of California without regard to choice of law
+ principles, and the UN Convention on Contracts for the International Sale of Goods
+ does not apply to this Agreement. The courts of California shall have exclusive
+ jurisdiction of any dispute arising out of this Agreement.
+
MODEL_LICENSE.md ADDED
@@ -0,0 +1,47 @@
+ # CodeFuse COMMUNITY LICENSE AGREEMENT
+ CodeFuse Release Date: September 8, 2023
+
+ By clicking to agree or by using or distributing any portion or element of the Materials, you will be deemed to have recognized and accepted the content of this Agreement, which is effective immediately.
+
+ 1. Definitions.
+ a. This CodeFuse COMMUNITY LICENSE AGREEMENT (this "Agreement") shall mean the terms and conditions for use, reproduction, distribution and modification of the Materials as defined by this Agreement.
+ b. "Ant" or "We" (or "Us") shall mean Ant Group.
+ c. "CodeFuse" shall mean the large language models (including CodeFuse-13B and CodeFuse-CodeLlaMa-34B), and software and algorithms, consisting of trained model weights, parameters (including optimizer states), machine-learning model code, and other elements of the foregoing distributed by Us.
+ d. "Documentation" shall mean the specifications, manuals and documentation accompanying CodeFuse distributed by Us.
+ e. "Materials" shall mean, collectively, Ant's proprietary CodeFuse and Documentation (and any portion thereof) made available under this Agreement.
+ f. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.
+ g. "Source" form shall mean the preferred form for making modifications, including but not limited to model source code, documentation source, and configuration files.
+ h. "Third Parties" (or "Third Party") shall mean individuals or legal entities that are not controlling, controlled by Us or You, or under common control with Us or You.
+ i. "You" (or "Your") shall mean a natural person or legal entity exercising the rights granted by this Agreement and/or using the Materials for any purpose and in any field of use.
+
+ 2. Grant of Rights.
+ You are granted a non-exclusive, worldwide, non-transferable and royalty-free limited license under Ant's intellectual property or other rights owned by Ant embodied in the Materials to use, reproduce, distribute, copy, create derivative works of, and make modifications to the Materials.
+
+ 3. Redistribution.
+ You may distribute or make the Materials or derivative works thereof available to a Third Party in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
+ a. You shall provide a copy of this Agreement to such Third Party;
+ b. if You modify the CodeFuse model, You shall provide a prominent notice, stating how You have modified the CodeFuse model, to such Third Party; and
+ c. You shall retain in all copies of the Materials that You distribute the following attribution notices within a "Notice" text file distributed as a part of such copies: "CodeFuse is licensed under the CodeFuse COMMUNITY LICENSE AGREEMENT, Copyright (c) Ant Group. All Rights Reserved."
+ You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such derivative works as a whole, provided Your use, reproduction, and distribution of the work otherwise complies with the terms and conditions of this Agreement.
+
+ 4. Rules of Use.
+ You shall comply with applicable laws and regulations (including without limitation export controls or restrictions) in Your use of the Materials.
+
+ 5. Intellectual Property.
+ a. Ant retains ownership of all intellectual property rights in and to the Materials and derivatives made by or for Ant. Conditioned upon compliance with the terms and conditions of this Agreement, with respect to any derivative works and modifications of the Materials that are made by You, You are and will be the owner of such derivative works and modifications.
+ b. No trademark license is granted to use the trade names, trademarks, service marks, or product names of Ant, except as required to fulfill notice requirements under this Agreement or as required for reasonable and customary use in describing and redistributing the Materials.
+ c. If You commence a lawsuit or other proceedings (including a cross-claim or counterclaim in a lawsuit) against Ant or any entity alleging that the Materials or any output therefrom, or any part of the foregoing, infringe any intellectual property or other right owned or licensable by You, then all licences granted to You under this Agreement shall terminate as of the date such lawsuit or other proceeding is commenced or brought.
+
+ 6. Disclaimer of Warranty and Limitation of Liability.
+ a. Ant is not obligated to support, update, provide training for, or develop any further version of the Materials or to grant any license thereto.
+ b. THE MATERIALS ARE PROVIDED "AS IS" WITHOUT ANY EXPRESS OR IMPLIED WARRANTY OF ANY KIND INCLUDING WARRANTIES OF TITLE, MERCHANTABILITY, NONINFRINGEMENT, OR FITNESS FOR A PARTICULAR PURPOSE. WE MAKE NO WARRANTY AND ASSUME NO RESPONSIBILITY FOR THE SAFETY OR STABILITY OF THE MATERIALS AND ANY OUTPUT THEREFROM.
+ c. YOU ARE SOLELY RESPONSIBLE FOR DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE MATERIALS AND ASSUME ANY RISKS ASSOCIATED WITH YOUR USE OF THE MATERIALS AND ANY OUTPUT AND RESULTS. IN NO EVENT SHALL WE BE LIABLE TO YOU FOR ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT OR ARISING FROM YOUR USE OR INABILITY TO USE THE MATERIALS OR ANY OUTPUT OF IT, FOR ANY DIRECT, OR INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL, EXEMPLARY OR PUNITIVE DAMAGES, NO MATTER HOW IT'S CAUSED OR EVEN IF ANT OR ITS AFFILIATES HAVE BEEN ADVISED OF THE POSSIBILITY OF ANY OF THE FOREGOING.
+ d. You will defend, indemnify and hold harmless Ant from and against any claim by any Third Party arising out of or related to Your use or distribution of the Materials.
+
+ 7. Survival and Termination.
+ a. The term of this Agreement shall commence upon Your acceptance of this Agreement or access to the Materials and will continue in full force and effect until terminated in accordance with the terms and conditions herein.
+ b. We may terminate this Agreement if You breach any of the terms or conditions of this Agreement. Upon termination of this Agreement, You must delete and cease use of the Materials. Sections 6 and 8 shall survive the termination of this Agreement.
+
+ 8. Governing Law and Jurisdiction.
+ a. This Agreement and any dispute arising out of or relating to it, whether in contract, tort, negligence, products liability, or otherwise, will be governed by the laws of China, without regard to conflict of law principles, and the UN Convention on Contracts for the International Sale of Goods does not apply to this Agreement.
+ b. The People's Courts in Hangzhou City shall have exclusive jurisdiction over any dispute arising out of this Agreement.
Notice ADDED
@@ -0,0 +1 @@
+ Llama 2 is licensed under the LLAMA 2 Community License, Copyright © Meta Platforms, Inc. All Rights Reserved.
USE_POLICY.md ADDED
@@ -0,0 +1,50 @@
+ # Llama 2 Acceptable Use Policy
+
+ Meta is committed to promoting safe and fair use of its tools and features, including Llama 2. If you access or use Llama 2, you agree to this Acceptable Use Policy (“Policy”). The most recent copy of this policy can be found at [ai.meta.com/llama/use-policy](http://ai.meta.com/llama/use-policy).
+
+ ## Prohibited Uses
+ We want everyone to use Llama 2 safely and responsibly. You agree you will not use, or allow others to use, Llama 2 to:
+
+ 1. Violate the law or others’ rights, including to:
+     1. Engage in, promote, generate, contribute to, encourage, plan, incite, or further illegal or unlawful activity or content, such as:
+         1. Violence or terrorism
+         2. Exploitation or harm to children, including the solicitation, creation, acquisition, or dissemination of child exploitative content or failure to report Child Sexual Abuse Material
+         3. Human trafficking, exploitation, and sexual violence
+         4. The illegal distribution of information or materials to minors, including obscene materials, or failure to employ legally required age-gating in connection with such information or materials.
+         5. Sexual solicitation
+         6. Any other criminal activity
+     2. Engage in, promote, incite, or facilitate the harassment, abuse, threatening, or bullying of individuals or groups of individuals
+     3. Engage in, promote, incite, or facilitate discrimination or other unlawful or harmful conduct in the provision of employment, employment benefits, credit, housing, other economic benefits, or other essential goods and services
+     4. Engage in the unauthorized or unlicensed practice of any profession including, but not limited to, financial, legal, medical/health, or related professional practices
+     5. Collect, process, disclose, generate, or infer health, demographic, or other sensitive personal or private information about individuals without rights and consents required by applicable laws
+     6. Engage in or facilitate any action or generate any content that infringes, misappropriates, or otherwise violates any third-party rights, including the outputs or results of any products or services using the Llama 2 Materials
+     7. Create, generate, or facilitate the creation of malicious code, malware, computer viruses or do anything else that could disable, overburden, interfere with or impair the proper working, integrity, operation or appearance of a website or computer system
+
+ 2. Engage in, promote, incite, facilitate, or assist in the planning or development of activities that present a risk of death or bodily harm to individuals, including use of Llama 2 related to the following:
+     1. Military, warfare, nuclear industries or applications, espionage, use for materials or activities that are subject to the International Traffic Arms Regulations (ITAR) maintained by the United States Department of State
+     2. Guns and illegal weapons (including weapon development)
+     3. Illegal drugs and regulated/controlled substances
+     4. Operation of critical infrastructure, transportation technologies, or heavy machinery
+     5. Self-harm or harm to others, including suicide, cutting, and eating disorders
+     6. Any content intended to incite or promote violence, abuse, or any infliction of bodily harm to an individual
+
+ 3. Intentionally deceive or mislead others, including use of Llama 2 related to the following:
+     1. Generating, promoting, or furthering fraud or the creation or promotion of disinformation
+     2. Generating, promoting, or furthering defamatory content, including the creation of defamatory statements, images, or other content
+     3. Generating, promoting, or further distributing spam
+     4. Impersonating another individual without consent, authorization, or legal right
+     5. Representing that the use of Llama 2 or outputs are human-generated
+     6. Generating or facilitating false online engagement, including fake reviews and other means of fake online engagement
+ 4. Fail to appropriately disclose to end users any known dangers of your AI system
+
+ Please report any violation of this Policy, software “bug,” or other problems that could lead to a violation of this Policy through one of the following means:
+
+ * Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
+ * Reporting risky content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
+ * Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
+ * Reporting violations of the Acceptable Use Policy or unlicensed uses of Llama: [LlamaUseReport@meta.com](mailto:LlamaUseReport@meta.com)
+
config.json ADDED
@@ -0,0 +1,29 @@
+ {
+   "_name_or_path": "/mnt/user/qumu/download_models/codellama/CodeLlama-34b-Python-hf",
+   "architectures": [
+     "LlamaForCausalLM"
+   ],
+   "bos_token_id": 1,
+   "eos_token": "</s>",
+   "eos_token_id": 2,
+   "hidden_act": "silu",
+   "hidden_size": 8192,
+   "initializer_range": 0.02,
+   "intermediate_size": 22016,
+   "max_position_embeddings": 16384,
+   "model_type": "llama",
+   "num_attention_heads": 64,
+   "num_hidden_layers": 48,
+   "num_key_value_heads": 8,
+   "pad_token": "<unk>",
+   "pad_token_id": 0,
+   "pretraining_tp": 1,
+   "rms_norm_eps": 1e-05,
+   "rope_scaling": null,
+   "rope_theta": 1000000,
+   "tie_word_embeddings": false,
+   "torch_dtype": "bfloat16",
+   "transformers_version": "4.32.0",
+   "use_cache": true,
+   "vocab_size": 32000
+ }
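
This config describes a 48-layer LLaMA-architecture model (derived from CodeLlama-34b-Python per `_name_or_path`) with hidden size 8192, 64 attention heads over 8 key/value heads, and 16384 position embeddings. A minimal sketch of reading the file and deriving the attention geometry from these values; the local path is an assumption:

```python
import json

# Assumes config.json has been downloaded next to the model weights.
with open("config.json") as f:
    cfg = json.load(f)

# Per-head dimension: 8192 / 64 = 128.
head_dim = cfg["hidden_size"] // cfg["num_attention_heads"]
# Grouped-query attention: 64 / 8 = 8 query heads share each KV head.
kv_groups = cfg["num_attention_heads"] // cfg["num_key_value_heads"]

print(f"{cfg['num_hidden_layers']} layers, head_dim={head_dim}, "
      f"{kv_groups} query heads per KV head, "
      f"context={cfg['max_position_embeddings']} tokens")
```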
generation_config.json ADDED
@@ -0,0 +1,6 @@
+ {
+   "_from_model_config": true,
+   "bos_token_id": 1,
+   "eos_token_id": 2,
+   "transformers_version": "4.32.0"
+ }
model-00001-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:65d5749a4d6c43dd1c66dfad4d5ad44e27bd6adc96bd0bdfdd13259a17381adf
+ size 9951875168
model-00002-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:cf42bf3cbe183a997eb6304033152207d59c264182a2889a0411b820952f638c
+ size 8356666776
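
Both `.safetensors` entries above are git-lfs pointer files rather than the weights themselves: each records only the spec version, a sha256 object id, and the blob size (roughly 10.0 GB and 8.4 GB here). A minimal sketch, assuming the real blobs have already been fetched under hypothetical local filenames, of parsing a pointer and verifying a download against it:

```python
import hashlib

def parse_lfs_pointer(path):
    """Read a git-lfs pointer file into its oid hash and byte size."""
    fields = dict(line.split(" ", 1) for line in open(path) if " " in line)
    return {
        "oid": fields["oid"].strip().split(":", 1)[1],  # drop the "sha256:" prefix
        "size": int(fields["size"]),
    }

def verify_blob(pointer_path, blob_path):
    """Check a downloaded shard against the pointer's size and sha256."""
    meta = parse_lfs_pointer(pointer_path)
    h, n = hashlib.sha256(), 0
    with open(blob_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # stream in 1 MiB chunks
            h.update(chunk)
            n += len(chunk)
    return n == meta["size"] and h.hexdigest() == meta["oid"]

# Hypothetical usage, with the pointer saved separately from the blob:
# verify_blob("model-00001-of-00002.safetensors.pointer",
#             "model-00001-of-00002.safetensors")
```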
model.safetensors.index.json ADDED
@@ -0,0 +1,1114 @@
+ {
+   "metadata": {
+     "total_size": 18308415488
+   },
+   "weight_map": {
+     "model.embed_tokens.weight": "model-00001-of-00002.safetensors",
+     "model.layers.0.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.0.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.0.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.0.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.0.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.0.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.0.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.0.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.0.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.0.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.0.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.0.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.0.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.0.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.0.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.0.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.0.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.0.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.0.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.0.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.0.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.0.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.1.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.1.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.1.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.1.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.1.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.1.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.1.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.1.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.1.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.1.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.1.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.1.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.1.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.1.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.1.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.1.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.1.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.1.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.1.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.1.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.1.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.1.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.2.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.2.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.2.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.2.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.2.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.2.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.2.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.2.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.2.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.2.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.2.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.2.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.2.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.2.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.2.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.2.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.2.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.2.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.2.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.2.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.2.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.2.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.3.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.3.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.3.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.3.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.3.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.3.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.3.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.3.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.3.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.3.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.3.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.3.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.3.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.3.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.3.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.3.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.3.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.3.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.3.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.3.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.3.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.3.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.4.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.4.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.4.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.4.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.4.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.4.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.4.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.4.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.4.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.4.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.4.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.4.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.4.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.4.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.4.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.4.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.4.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.4.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.4.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.4.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.4.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.4.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.5.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.5.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.5.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.5.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.5.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.5.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.5.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.5.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.5.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.5.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.5.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.5.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.5.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.5.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.5.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.5.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.5.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.5.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.5.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.5.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.5.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.5.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.6.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.6.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.6.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.6.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.6.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.6.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.6.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.6.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.6.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.6.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.6.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.6.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.6.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.6.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.6.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.6.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.6.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.6.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.6.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.6.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.6.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.6.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.7.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.7.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.7.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.7.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.7.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.7.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.7.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.7.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.7.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.7.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.7.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.7.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.7.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.7.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.7.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.7.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.7.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.7.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.7.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.7.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.7.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.7.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.8.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.8.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.8.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.8.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.8.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.8.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.8.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.8.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.8.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.8.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.8.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.8.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.8.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.8.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.8.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.8.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.8.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.8.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.8.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.8.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.8.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.8.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.9.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.9.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.9.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.9.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.9.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.9.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.9.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.9.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.9.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.9.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.9.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.9.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.9.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.9.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.9.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.9.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.9.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.9.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.9.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.9.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.9.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.9.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.10.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.10.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.10.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.10.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.10.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.10.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.10.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.10.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.10.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.10.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.10.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.10.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.10.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.10.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.10.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.10.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.10.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.10.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.10.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.10.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.10.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.10.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.10.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.11.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.11.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.11.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.11.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.11.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.11.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.11.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.11.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.11.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.11.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.11.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.11.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.11.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.11.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.11.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.11.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.11.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.11.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.11.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.11.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.11.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.11.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.11.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.12.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.12.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.12.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.12.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.12.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.12.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.12.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.12.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.12.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.12.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.12.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.12.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.12.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.12.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.12.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.12.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.12.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.12.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.12.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.12.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.12.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.12.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.12.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.13.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.13.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.13.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.13.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.13.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.13.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.13.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.13.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.13.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.13.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.13.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.13.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.13.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.13.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.13.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.13.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.13.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.13.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.13.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.13.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.13.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.13.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.13.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.14.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.14.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.14.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.14.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.14.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.14.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.14.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.14.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.14.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.14.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.14.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.14.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.14.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.14.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.14.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.14.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.14.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.14.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.14.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.14.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.14.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.14.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.14.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.15.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.15.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.15.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.15.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.15.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.15.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.15.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.15.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.15.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.15.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.15.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.15.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.15.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.15.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.15.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.15.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.15.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.15.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.15.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.15.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.15.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.15.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.15.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.16.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.16.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.16.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.16.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.16.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.16.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.16.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.16.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.16.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.16.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.16.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.16.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.16.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.16.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.16.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.16.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.16.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.16.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.16.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.16.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.16.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.16.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.16.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.17.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.17.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.17.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.17.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.17.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.17.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.17.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.17.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.17.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.17.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.17.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.17.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.17.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.17.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.17.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.17.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.17.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.17.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.17.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.17.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.17.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.17.input_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.17.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
+     "model.layers.18.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.18.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.18.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.18.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.18.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.18.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.18.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.18.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.18.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.18.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.18.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.18.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.18.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.18.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.18.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.18.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.18.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.18.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.18.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
+     "model.layers.18.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
+     "model.layers.18.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
+     "model.layers.18.input_layernorm.weight": "model-00001-of-00002.safetensors",
443
+ "model.layers.18.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
444
+ "model.layers.19.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
445
+ "model.layers.19.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
446
+ "model.layers.19.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
447
+ "model.layers.19.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
448
+ "model.layers.19.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
449
+ "model.layers.19.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
450
+ "model.layers.19.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
451
+ "model.layers.19.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
452
+ "model.layers.19.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
453
+ "model.layers.19.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
454
+ "model.layers.19.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
455
+ "model.layers.19.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
456
+ "model.layers.19.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
457
+ "model.layers.19.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
458
+ "model.layers.19.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
459
+ "model.layers.19.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
460
+ "model.layers.19.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
461
+ "model.layers.19.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
462
+ "model.layers.19.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
463
+ "model.layers.19.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
464
+ "model.layers.19.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
465
+ "model.layers.19.input_layernorm.weight": "model-00001-of-00002.safetensors",
466
+ "model.layers.19.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
467
+ "model.layers.20.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
468
+ "model.layers.20.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
469
+ "model.layers.20.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
470
+ "model.layers.20.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
471
+ "model.layers.20.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
472
+ "model.layers.20.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
473
+ "model.layers.20.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
474
+ "model.layers.20.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
475
+ "model.layers.20.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
476
+ "model.layers.20.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
477
+ "model.layers.20.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
478
+ "model.layers.20.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
479
+ "model.layers.20.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
480
+ "model.layers.20.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
481
+ "model.layers.20.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
482
+ "model.layers.20.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
483
+ "model.layers.20.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
484
+ "model.layers.20.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
485
+ "model.layers.20.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
486
+ "model.layers.20.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
487
+ "model.layers.20.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
488
+ "model.layers.20.input_layernorm.weight": "model-00001-of-00002.safetensors",
489
+ "model.layers.20.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
490
+ "model.layers.21.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
491
+ "model.layers.21.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
492
+ "model.layers.21.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
493
+ "model.layers.21.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
494
+ "model.layers.21.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
495
+ "model.layers.21.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
496
+ "model.layers.21.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
497
+ "model.layers.21.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
498
+ "model.layers.21.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
499
+ "model.layers.21.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
500
+ "model.layers.21.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
501
+ "model.layers.21.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
502
+ "model.layers.21.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
503
+ "model.layers.21.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
504
+ "model.layers.21.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
505
+ "model.layers.21.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
506
+ "model.layers.21.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
507
+ "model.layers.21.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
508
+ "model.layers.21.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
509
+ "model.layers.21.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
510
+ "model.layers.21.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
511
+ "model.layers.21.input_layernorm.weight": "model-00001-of-00002.safetensors",
512
+ "model.layers.21.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
513
+ "model.layers.22.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
514
+ "model.layers.22.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
515
+ "model.layers.22.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
516
+ "model.layers.22.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
517
+ "model.layers.22.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
518
+ "model.layers.22.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
519
+ "model.layers.22.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
520
+ "model.layers.22.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
521
+ "model.layers.22.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
522
+ "model.layers.22.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
523
+ "model.layers.22.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
524
+ "model.layers.22.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
525
+ "model.layers.22.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
526
+ "model.layers.22.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
527
+ "model.layers.22.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
528
+ "model.layers.22.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
529
+ "model.layers.22.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
530
+ "model.layers.22.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
531
+ "model.layers.22.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
532
+ "model.layers.22.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
533
+ "model.layers.22.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
534
+ "model.layers.22.input_layernorm.weight": "model-00001-of-00002.safetensors",
535
+ "model.layers.22.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
536
+ "model.layers.23.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
537
+ "model.layers.23.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
538
+ "model.layers.23.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
539
+ "model.layers.23.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
540
+ "model.layers.23.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
541
+ "model.layers.23.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
542
+ "model.layers.23.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
543
+ "model.layers.23.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
544
+ "model.layers.23.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
545
+ "model.layers.23.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
546
+ "model.layers.23.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
547
+ "model.layers.23.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
548
+ "model.layers.23.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
549
+ "model.layers.23.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
550
+ "model.layers.23.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
551
+ "model.layers.23.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
552
+ "model.layers.23.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
553
+ "model.layers.23.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
554
+ "model.layers.23.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
555
+ "model.layers.23.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
556
+ "model.layers.23.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
557
+ "model.layers.23.input_layernorm.weight": "model-00001-of-00002.safetensors",
558
+ "model.layers.23.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
559
+ "model.layers.24.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
560
+ "model.layers.24.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
561
+ "model.layers.24.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
562
+ "model.layers.24.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
563
+ "model.layers.24.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
564
+ "model.layers.24.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
565
+ "model.layers.24.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
566
+ "model.layers.24.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
567
+ "model.layers.24.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
568
+ "model.layers.24.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
569
+ "model.layers.24.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
570
+ "model.layers.24.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
571
+ "model.layers.24.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
572
+ "model.layers.24.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
573
+ "model.layers.24.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
574
+ "model.layers.24.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
575
+ "model.layers.24.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
576
+ "model.layers.24.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
577
+ "model.layers.24.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
578
+ "model.layers.24.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
579
+ "model.layers.24.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
580
+ "model.layers.24.input_layernorm.weight": "model-00001-of-00002.safetensors",
581
+ "model.layers.24.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
582
+ "model.layers.25.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
583
+ "model.layers.25.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
584
+ "model.layers.25.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
585
+ "model.layers.25.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
586
+ "model.layers.25.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
587
+ "model.layers.25.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
588
+ "model.layers.25.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
589
+ "model.layers.25.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
590
+ "model.layers.25.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
591
+ "model.layers.25.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
592
+ "model.layers.25.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
593
+ "model.layers.25.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
594
+ "model.layers.25.mlp.gate_proj.qweight": "model-00001-of-00002.safetensors",
595
+ "model.layers.25.mlp.gate_proj.qzeros": "model-00001-of-00002.safetensors",
596
+ "model.layers.25.mlp.gate_proj.scales": "model-00001-of-00002.safetensors",
597
+ "model.layers.25.mlp.up_proj.qweight": "model-00001-of-00002.safetensors",
598
+ "model.layers.25.mlp.up_proj.qzeros": "model-00001-of-00002.safetensors",
599
+ "model.layers.25.mlp.up_proj.scales": "model-00001-of-00002.safetensors",
600
+ "model.layers.25.mlp.down_proj.qweight": "model-00001-of-00002.safetensors",
601
+ "model.layers.25.mlp.down_proj.qzeros": "model-00001-of-00002.safetensors",
602
+ "model.layers.25.mlp.down_proj.scales": "model-00001-of-00002.safetensors",
603
+ "model.layers.25.input_layernorm.weight": "model-00001-of-00002.safetensors",
604
+ "model.layers.25.post_attention_layernorm.weight": "model-00001-of-00002.safetensors",
605
+ "model.layers.26.self_attn.q_proj.qweight": "model-00001-of-00002.safetensors",
606
+ "model.layers.26.self_attn.q_proj.qzeros": "model-00001-of-00002.safetensors",
607
+ "model.layers.26.self_attn.q_proj.scales": "model-00001-of-00002.safetensors",
608
+ "model.layers.26.self_attn.k_proj.qweight": "model-00001-of-00002.safetensors",
609
+ "model.layers.26.self_attn.k_proj.qzeros": "model-00001-of-00002.safetensors",
610
+ "model.layers.26.self_attn.k_proj.scales": "model-00001-of-00002.safetensors",
611
+ "model.layers.26.self_attn.v_proj.qweight": "model-00001-of-00002.safetensors",
612
+ "model.layers.26.self_attn.v_proj.qzeros": "model-00001-of-00002.safetensors",
613
+ "model.layers.26.self_attn.v_proj.scales": "model-00001-of-00002.safetensors",
614
+ "model.layers.26.self_attn.o_proj.qweight": "model-00001-of-00002.safetensors",
615
+ "model.layers.26.self_attn.o_proj.qzeros": "model-00001-of-00002.safetensors",
616
+ "model.layers.26.self_attn.o_proj.scales": "model-00001-of-00002.safetensors",
617
+ "model.layers.26.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
618
+ "model.layers.26.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
619
+ "model.layers.26.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
620
+ "model.layers.26.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
621
+ "model.layers.26.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
622
+ "model.layers.26.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
623
+ "model.layers.26.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
624
+ "model.layers.26.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
625
+ "model.layers.26.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
626
+ "model.layers.26.input_layernorm.weight": "model-00002-of-00002.safetensors",
627
+ "model.layers.26.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
628
+ "model.layers.27.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
629
+ "model.layers.27.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
630
+ "model.layers.27.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
631
+ "model.layers.27.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
632
+ "model.layers.27.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
633
+ "model.layers.27.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
634
+ "model.layers.27.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
635
+ "model.layers.27.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
636
+ "model.layers.27.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
637
+ "model.layers.27.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
638
+ "model.layers.27.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
639
+ "model.layers.27.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
640
+ "model.layers.27.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
641
+ "model.layers.27.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
642
+ "model.layers.27.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
643
+ "model.layers.27.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
644
+ "model.layers.27.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
645
+ "model.layers.27.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
646
+ "model.layers.27.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
647
+ "model.layers.27.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
648
+ "model.layers.27.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
649
+ "model.layers.27.input_layernorm.weight": "model-00002-of-00002.safetensors",
650
+ "model.layers.27.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
651
+ "model.layers.28.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
652
+ "model.layers.28.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
653
+ "model.layers.28.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
654
+ "model.layers.28.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
655
+ "model.layers.28.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
656
+ "model.layers.28.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
657
+ "model.layers.28.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
658
+ "model.layers.28.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
659
+ "model.layers.28.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
660
+ "model.layers.28.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
661
+ "model.layers.28.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
662
+ "model.layers.28.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
663
+ "model.layers.28.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
664
+ "model.layers.28.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
665
+ "model.layers.28.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
666
+ "model.layers.28.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
667
+ "model.layers.28.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
668
+ "model.layers.28.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
669
+ "model.layers.28.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
670
+ "model.layers.28.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
671
+ "model.layers.28.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
672
+ "model.layers.28.input_layernorm.weight": "model-00002-of-00002.safetensors",
673
+ "model.layers.28.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
674
+ "model.layers.29.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
675
+ "model.layers.29.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
676
+ "model.layers.29.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
677
+ "model.layers.29.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
678
+ "model.layers.29.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
679
+ "model.layers.29.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
680
+ "model.layers.29.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
681
+ "model.layers.29.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
682
+ "model.layers.29.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
683
+ "model.layers.29.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
684
+ "model.layers.29.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
685
+ "model.layers.29.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
686
+ "model.layers.29.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
687
+ "model.layers.29.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
688
+ "model.layers.29.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
689
+ "model.layers.29.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
690
+ "model.layers.29.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
691
+ "model.layers.29.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
692
+ "model.layers.29.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
693
+ "model.layers.29.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
694
+ "model.layers.29.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
695
+ "model.layers.29.input_layernorm.weight": "model-00002-of-00002.safetensors",
696
+ "model.layers.29.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
697
+ "model.layers.30.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
698
+ "model.layers.30.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
699
+ "model.layers.30.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
700
+ "model.layers.30.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
701
+ "model.layers.30.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
702
+ "model.layers.30.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
703
+ "model.layers.30.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
704
+ "model.layers.30.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
705
+ "model.layers.30.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
706
+ "model.layers.30.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
707
+ "model.layers.30.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
708
+ "model.layers.30.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
709
+ "model.layers.30.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
710
+ "model.layers.30.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
711
+ "model.layers.30.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
712
+ "model.layers.30.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
713
+ "model.layers.30.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
714
+ "model.layers.30.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
715
+ "model.layers.30.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
716
+ "model.layers.30.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
717
+ "model.layers.30.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
718
+ "model.layers.30.input_layernorm.weight": "model-00002-of-00002.safetensors",
719
+ "model.layers.30.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
720
+ "model.layers.31.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
721
+ "model.layers.31.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
722
+ "model.layers.31.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
723
+ "model.layers.31.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
724
+ "model.layers.31.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
725
+ "model.layers.31.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
726
+ "model.layers.31.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
727
+ "model.layers.31.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
728
+ "model.layers.31.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
729
+ "model.layers.31.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
730
+ "model.layers.31.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
731
+ "model.layers.31.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
732
+ "model.layers.31.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
733
+ "model.layers.31.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
734
+ "model.layers.31.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
735
+ "model.layers.31.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
736
+ "model.layers.31.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
737
+ "model.layers.31.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
738
+ "model.layers.31.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
739
+ "model.layers.31.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
740
+ "model.layers.31.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
741
+ "model.layers.31.input_layernorm.weight": "model-00002-of-00002.safetensors",
742
+ "model.layers.31.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
743
+ "model.layers.32.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
744
+ "model.layers.32.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
745
+ "model.layers.32.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
746
+ "model.layers.32.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
747
+ "model.layers.32.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
748
+ "model.layers.32.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
749
+ "model.layers.32.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
750
+ "model.layers.32.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
751
+ "model.layers.32.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
752
+ "model.layers.32.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
753
+ "model.layers.32.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
754
+ "model.layers.32.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
755
+ "model.layers.32.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
756
+ "model.layers.32.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
757
+ "model.layers.32.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
758
+ "model.layers.32.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
759
+ "model.layers.32.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
760
+ "model.layers.32.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
761
+ "model.layers.32.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
762
+ "model.layers.32.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
763
+ "model.layers.32.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
764
+ "model.layers.32.input_layernorm.weight": "model-00002-of-00002.safetensors",
765
+ "model.layers.32.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
766
+ "model.layers.33.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
767
+ "model.layers.33.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
768
+ "model.layers.33.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
769
+ "model.layers.33.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
770
+ "model.layers.33.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
771
+ "model.layers.33.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
772
+ "model.layers.33.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
773
+ "model.layers.33.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
774
+ "model.layers.33.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
775
+ "model.layers.33.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
776
+ "model.layers.33.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
777
+ "model.layers.33.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
778
+ "model.layers.33.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
779
+ "model.layers.33.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
780
+ "model.layers.33.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
781
+ "model.layers.33.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
782
+ "model.layers.33.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
783
+ "model.layers.33.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
784
+ "model.layers.33.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
785
+ "model.layers.33.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
786
+ "model.layers.33.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
787
+ "model.layers.33.input_layernorm.weight": "model-00002-of-00002.safetensors",
788
+ "model.layers.33.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
789
+ "model.layers.34.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
790
+ "model.layers.34.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
791
+ "model.layers.34.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
792
+ "model.layers.34.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
793
+ "model.layers.34.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
794
+ "model.layers.34.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
795
+ "model.layers.34.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
796
+ "model.layers.34.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
797
+ "model.layers.34.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
798
+ "model.layers.34.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
799
+ "model.layers.34.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
800
+ "model.layers.34.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
801
+ "model.layers.34.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
802
+ "model.layers.34.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
803
+ "model.layers.34.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
804
+ "model.layers.34.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
805
+ "model.layers.34.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
806
+ "model.layers.34.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
807
+ "model.layers.34.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
808
+ "model.layers.34.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
809
+ "model.layers.34.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
810
+ "model.layers.34.input_layernorm.weight": "model-00002-of-00002.safetensors",
811
+ "model.layers.34.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
812
+ "model.layers.35.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
813
+ "model.layers.35.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
814
+ "model.layers.35.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
815
+ "model.layers.35.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
816
+ "model.layers.35.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
817
+ "model.layers.35.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
818
+ "model.layers.35.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
819
+ "model.layers.35.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
820
+ "model.layers.35.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
821
+ "model.layers.35.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
822
+ "model.layers.35.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
823
+ "model.layers.35.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
824
+ "model.layers.35.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
825
+ "model.layers.35.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
826
+ "model.layers.35.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
827
+ "model.layers.35.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
828
+ "model.layers.35.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
829
+ "model.layers.35.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
830
+ "model.layers.35.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
831
+ "model.layers.35.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
832
+ "model.layers.35.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
833
+ "model.layers.35.input_layernorm.weight": "model-00002-of-00002.safetensors",
834
+ "model.layers.35.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
835
+ "model.layers.36.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
836
+ "model.layers.36.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
837
+ "model.layers.36.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
838
+ "model.layers.36.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
839
+ "model.layers.36.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
840
+ "model.layers.36.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
841
+ "model.layers.36.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
842
+ "model.layers.36.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
843
+ "model.layers.36.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
844
+ "model.layers.36.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
845
+ "model.layers.36.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
846
+ "model.layers.36.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
847
+ "model.layers.36.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
848
+ "model.layers.36.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
849
+ "model.layers.36.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
850
+ "model.layers.36.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
851
+ "model.layers.36.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
852
+ "model.layers.36.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
853
+ "model.layers.36.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
854
+ "model.layers.36.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
855
+ "model.layers.36.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
856
+ "model.layers.36.input_layernorm.weight": "model-00002-of-00002.safetensors",
857
+ "model.layers.36.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
858
+ "model.layers.37.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
859
+ "model.layers.37.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
860
+ "model.layers.37.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
861
+ "model.layers.37.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
862
+ "model.layers.37.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
863
+ "model.layers.37.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
864
+ "model.layers.37.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
865
+ "model.layers.37.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
866
+ "model.layers.37.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
867
+ "model.layers.37.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
868
+ "model.layers.37.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
869
+ "model.layers.37.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
870
+ "model.layers.37.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
871
+ "model.layers.37.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
872
+ "model.layers.37.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
873
+ "model.layers.37.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
874
+ "model.layers.37.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
875
+ "model.layers.37.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
876
+ "model.layers.37.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
877
+ "model.layers.37.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
878
+ "model.layers.37.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
879
+ "model.layers.37.input_layernorm.weight": "model-00002-of-00002.safetensors",
880
+ "model.layers.37.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
881
+ "model.layers.38.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
882
+ "model.layers.38.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
883
+ "model.layers.38.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
884
+ "model.layers.38.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
885
+ "model.layers.38.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
886
+ "model.layers.38.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
887
+ "model.layers.38.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
888
+ "model.layers.38.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
889
+ "model.layers.38.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
890
+ "model.layers.38.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
891
+ "model.layers.38.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
892
+ "model.layers.38.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
893
+ "model.layers.38.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
894
+ "model.layers.38.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
895
+ "model.layers.38.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
896
+ "model.layers.38.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
897
+ "model.layers.38.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
898
+ "model.layers.38.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
899
+ "model.layers.38.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
900
+ "model.layers.38.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
901
+ "model.layers.38.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
902
+ "model.layers.38.input_layernorm.weight": "model-00002-of-00002.safetensors",
903
+ "model.layers.38.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
904
+ "model.layers.39.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
905
+ "model.layers.39.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
906
+ "model.layers.39.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
907
+ "model.layers.39.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
908
+ "model.layers.39.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
909
+ "model.layers.39.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
910
+ "model.layers.39.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
911
+ "model.layers.39.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
912
+ "model.layers.39.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
913
+ "model.layers.39.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
914
+ "model.layers.39.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
915
+ "model.layers.39.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
916
+ "model.layers.39.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
917
+ "model.layers.39.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
918
+ "model.layers.39.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
919
+ "model.layers.39.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
920
+ "model.layers.39.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
921
+ "model.layers.39.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
922
+ "model.layers.39.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
923
+ "model.layers.39.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
924
+ "model.layers.39.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
925
+ "model.layers.39.input_layernorm.weight": "model-00002-of-00002.safetensors",
926
+ "model.layers.39.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
927
+ "model.layers.40.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
928
+ "model.layers.40.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
929
+ "model.layers.40.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
930
+ "model.layers.40.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
931
+ "model.layers.40.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
932
+ "model.layers.40.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
933
+ "model.layers.40.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
934
+ "model.layers.40.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
935
+ "model.layers.40.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
936
+ "model.layers.40.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
937
+ "model.layers.40.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
938
+ "model.layers.40.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
939
+ "model.layers.40.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
940
+ "model.layers.40.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
941
+ "model.layers.40.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
942
+ "model.layers.40.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
943
+ "model.layers.40.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
944
+ "model.layers.40.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
945
+ "model.layers.40.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
946
+ "model.layers.40.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
947
+ "model.layers.40.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
948
+ "model.layers.40.input_layernorm.weight": "model-00002-of-00002.safetensors",
949
+ "model.layers.40.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
950
+ "model.layers.41.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
951
+ "model.layers.41.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
952
+ "model.layers.41.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
953
+ "model.layers.41.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
954
+ "model.layers.41.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
955
+ "model.layers.41.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
956
+ "model.layers.41.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
957
+ "model.layers.41.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
958
+ "model.layers.41.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
959
+ "model.layers.41.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
960
+ "model.layers.41.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
961
+ "model.layers.41.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
962
+ "model.layers.41.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
963
+ "model.layers.41.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
964
+ "model.layers.41.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
965
+ "model.layers.41.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
966
+ "model.layers.41.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
967
+ "model.layers.41.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
968
+ "model.layers.41.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
969
+ "model.layers.41.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
970
+ "model.layers.41.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
971
+ "model.layers.41.input_layernorm.weight": "model-00002-of-00002.safetensors",
972
+ "model.layers.41.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
973
+ "model.layers.42.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
974
+ "model.layers.42.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
975
+ "model.layers.42.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
976
+ "model.layers.42.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
977
+ "model.layers.42.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
978
+ "model.layers.42.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
979
+ "model.layers.42.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
980
+ "model.layers.42.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
981
+ "model.layers.42.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
982
+ "model.layers.42.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
983
+ "model.layers.42.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
984
+ "model.layers.42.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
985
+ "model.layers.42.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
986
+ "model.layers.42.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
987
+ "model.layers.42.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
988
+ "model.layers.42.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
989
+ "model.layers.42.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
990
+ "model.layers.42.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
991
+ "model.layers.42.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
992
+ "model.layers.42.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
993
+ "model.layers.42.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
994
+ "model.layers.42.input_layernorm.weight": "model-00002-of-00002.safetensors",
995
+ "model.layers.42.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
996
+ "model.layers.43.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
997
+ "model.layers.43.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
998
+ "model.layers.43.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
999
+ "model.layers.43.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
1000
+ "model.layers.43.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
1001
+ "model.layers.43.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
1002
+ "model.layers.43.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
1003
+ "model.layers.43.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
1004
+ "model.layers.43.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
1005
+ "model.layers.43.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
1006
+ "model.layers.43.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
1007
+ "model.layers.43.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
1008
+ "model.layers.43.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
1009
+ "model.layers.43.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
1010
+ "model.layers.43.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
1011
+ "model.layers.43.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
1012
+ "model.layers.43.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
1013
+ "model.layers.43.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
1014
+ "model.layers.43.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
1015
+ "model.layers.43.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
1016
+ "model.layers.43.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
1017
+ "model.layers.43.input_layernorm.weight": "model-00002-of-00002.safetensors",
1018
+ "model.layers.43.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
1019
+ "model.layers.44.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
1020
+ "model.layers.44.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
1021
+ "model.layers.44.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
1022
+ "model.layers.44.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
1023
+ "model.layers.44.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
1024
+ "model.layers.44.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
1025
+ "model.layers.44.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
1026
+ "model.layers.44.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
1027
+ "model.layers.44.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
1028
+ "model.layers.44.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
1029
+ "model.layers.44.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
1030
+ "model.layers.44.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
1031
+ "model.layers.44.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
1032
+ "model.layers.44.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
1033
+ "model.layers.44.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
1034
+ "model.layers.44.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
1035
+ "model.layers.44.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
1036
+ "model.layers.44.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
1037
+ "model.layers.44.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
1038
+ "model.layers.44.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
1039
+ "model.layers.44.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
1040
+ "model.layers.44.input_layernorm.weight": "model-00002-of-00002.safetensors",
1041
+ "model.layers.44.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
1042
+ "model.layers.45.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
1043
+ "model.layers.45.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
1044
+ "model.layers.45.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
1045
+ "model.layers.45.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
1046
+ "model.layers.45.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
1047
+ "model.layers.45.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
1048
+ "model.layers.45.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
1049
+ "model.layers.45.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
1050
+ "model.layers.45.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
1051
+ "model.layers.45.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
1052
+ "model.layers.45.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
1053
+ "model.layers.45.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
1054
+ "model.layers.45.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
1055
+ "model.layers.45.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
1056
+ "model.layers.45.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
1057
+ "model.layers.45.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
1058
+ "model.layers.45.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
1059
+ "model.layers.45.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
1060
+ "model.layers.45.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
1061
+ "model.layers.45.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
1062
+ "model.layers.45.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
1063
+ "model.layers.45.input_layernorm.weight": "model-00002-of-00002.safetensors",
1064
+ "model.layers.45.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
1065
+ "model.layers.46.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
1066
+ "model.layers.46.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
1067
+ "model.layers.46.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
1068
+ "model.layers.46.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
1069
+ "model.layers.46.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
1070
+ "model.layers.46.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
1071
+ "model.layers.46.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
1072
+ "model.layers.46.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
1073
+ "model.layers.46.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
1074
+ "model.layers.46.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
1075
+ "model.layers.46.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
1076
+ "model.layers.46.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
1077
+ "model.layers.46.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
1078
+ "model.layers.46.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
1079
+ "model.layers.46.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
1080
+ "model.layers.46.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
1081
+ "model.layers.46.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
1082
+ "model.layers.46.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
1083
+ "model.layers.46.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
1084
+ "model.layers.46.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
1085
+ "model.layers.46.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
1086
+ "model.layers.46.input_layernorm.weight": "model-00002-of-00002.safetensors",
1087
+ "model.layers.46.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
1088
+ "model.layers.47.self_attn.q_proj.qweight": "model-00002-of-00002.safetensors",
1089
+ "model.layers.47.self_attn.q_proj.qzeros": "model-00002-of-00002.safetensors",
1090
+ "model.layers.47.self_attn.q_proj.scales": "model-00002-of-00002.safetensors",
1091
+ "model.layers.47.self_attn.k_proj.qweight": "model-00002-of-00002.safetensors",
1092
+ "model.layers.47.self_attn.k_proj.qzeros": "model-00002-of-00002.safetensors",
1093
+ "model.layers.47.self_attn.k_proj.scales": "model-00002-of-00002.safetensors",
1094
+ "model.layers.47.self_attn.v_proj.qweight": "model-00002-of-00002.safetensors",
1095
+ "model.layers.47.self_attn.v_proj.qzeros": "model-00002-of-00002.safetensors",
1096
+ "model.layers.47.self_attn.v_proj.scales": "model-00002-of-00002.safetensors",
1097
+ "model.layers.47.self_attn.o_proj.qweight": "model-00002-of-00002.safetensors",
1098
+ "model.layers.47.self_attn.o_proj.qzeros": "model-00002-of-00002.safetensors",
1099
+ "model.layers.47.self_attn.o_proj.scales": "model-00002-of-00002.safetensors",
1100
+ "model.layers.47.mlp.gate_proj.qweight": "model-00002-of-00002.safetensors",
1101
+ "model.layers.47.mlp.gate_proj.qzeros": "model-00002-of-00002.safetensors",
1102
+ "model.layers.47.mlp.gate_proj.scales": "model-00002-of-00002.safetensors",
1103
+ "model.layers.47.mlp.up_proj.qweight": "model-00002-of-00002.safetensors",
1104
+ "model.layers.47.mlp.up_proj.qzeros": "model-00002-of-00002.safetensors",
1105
+ "model.layers.47.mlp.up_proj.scales": "model-00002-of-00002.safetensors",
1106
+ "model.layers.47.mlp.down_proj.qweight": "model-00002-of-00002.safetensors",
1107
+ "model.layers.47.mlp.down_proj.qzeros": "model-00002-of-00002.safetensors",
1108
+ "model.layers.47.mlp.down_proj.scales": "model-00002-of-00002.safetensors",
1109
+ "model.layers.47.input_layernorm.weight": "model-00002-of-00002.safetensors",
1110
+ "model.layers.47.post_attention_layernorm.weight": "model-00002-of-00002.safetensors",
1111
+ "model.norm.weight": "model-00002-of-00002.safetensors",
1112
+ "lm_head.weight": "model-00002-of-00002.safetensors"
1113
+ }
1114
+ }
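The `weight_map` above is the standard safetensors sharding index: it maps every tensor name (here the AWQ-packed `qweight`/`qzeros`/`scales` triples plus the layernorms, final norm, and `lm_head`) to the shard file that holds it. As a minimal sketch of how a loader uses this index, assuming the index and shard files sit in the current directory:

```python
import json
from safetensors import safe_open

# Look up which shard holds a tensor, then load only that tensor.
with open("model.safetensors.index.json") as f:
    index = json.load(f)

name = "model.layers.47.mlp.down_proj.qweight"
shard = index["weight_map"][name]  # "model-00002-of-00002.safetensors"

with safe_open(shard, framework="pt") as f:
    qweight = f.get_tensor(name)   # int32 tensor holding the packed 4-bit weights

print(shard, tuple(qweight.shape), qweight.dtype)
```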
quant_config.json ADDED
@@ -0,0 +1,6 @@
+ {
+ "zero_point": true,
+ "q_group_size": 128,
+ "w_bit": 4,
+ "version": "GEMM"
+ }
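These four fields describe the AWQ scheme: 4-bit weights (`w_bit`), one scale and one zero point per group of 128 input channels (`q_group_size`, `zero_point`), and the GEMM kernel layout (`version`). Within each group, dequantization is approximately `w = (q - zero) * scale`. A minimal loading sketch, assuming the AutoAWQ package (`pip install autoawq`) and a hypothetical repo id; AutoAWQ reads `quant_config.json` from the checkpoint itself, so none of these values need to be passed explicitly:

```python
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = "TheBloke/Example-AWQ"  # hypothetical id; substitute this repo's

# AutoAWQ picks up quant_config.json from the checkpoint directory.
model = AutoAWQForCausalLM.from_quantized(model_path, fuse_layers=True)
tokenizer = AutoTokenizer.from_pretrained(model_path)
```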
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
+ {
+ "bos_token": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
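These are the standard Llama 2 sentencepiece control tokens. Once the tokenizer is loaded, `transformers` exposes them as attributes; a quick check, again with a hypothetical repo id:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("TheBloke/Example-AWQ")  # hypothetical id

assert tok.bos_token == "<s>" and tok.eos_token == "</s>" and tok.unk_token == "<unk>"
print(tok.bos_token_id, tok.eos_token_id, tok.unk_token_id)  # typically 1, 2, 0 for Llama
```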
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
tokenizer.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347
+ size 499723
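As the spec line indicates, what is committed here is a Git LFS pointer, not the sentencepiece model itself; LFS substitutes the real 499,723-byte file on checkout or download. A minimal integrity check against the pointer's recorded digest, assuming the real file (not the pointer) has been fetched locally:

```python
import hashlib

EXPECTED_SHA256 = "9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347"
EXPECTED_SIZE = 499723

with open("tokenizer.model", "rb") as f:
    data = f.read()

assert len(data) == EXPECTED_SIZE, "size mismatch (is this still the LFS pointer?)"
assert hashlib.sha256(data).hexdigest() == EXPECTED_SHA256, "sha256 mismatch"
```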
tokenizer_config.json ADDED
@@ -0,0 +1,33 @@
+ {
+ "bos_token": {
+ "__type": "AddedToken",
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ },
+ "clean_up_tokenization_spaces": false,
+ "eos_token": {
+ "__type": "AddedToken",
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ },
+ "legacy": null,
+ "model_max_length": 1000000000000000019884624838656,
+ "pad_token": null,
+ "sp_model_kwargs": {},
+ "tokenizer_class": "LlamaTokenizer",
+ "unk_token": {
+ "__type": "AddedToken",
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false
+ },
+ "use_default_system_prompt": true
+ }
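Note that `model_max_length` here is the `transformers` "unset" sentinel (int(1e30), with the odd trailing digits coming from float rounding), not a real context limit, so callers should pass an explicit truncation length. A minimal sketch, once more with a hypothetical repo id:

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("TheBloke/Example-AWQ")  # hypothetical id

# model_max_length is effectively "unlimited", so truncate explicitly.
enc = tok("Hello, world!", truncation=True, max_length=4096, return_tensors="pt")
print(enc["input_ids"].shape)
```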