Abhaykoul committed on
Commit 422d349
1 Parent(s): 0d9c021

Upload folder using huggingface_hub

.gitattributes CHANGED
@@ -33,3 +33,6 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
+ helpingai2.5-5b-q4_k_m.gguf filter=lfs diff=lfs merge=lfs -text
+ q4_k_m.gguf filter=lfs diff=lfs merge=lfs -text
LICENSE.md ADDED
@@ -0,0 +1,214 @@
+ # HelpingAI License
+ Version 3.0
+
+ Copyright (c) 2024 HelpingAI
+
+ All Rights Reserved.
+
+ ## PREAMBLE
+ The HelpingAI License (the "License") is designed to govern the use, modification, and distribution of HelpingAI's proprietary assets, including but not limited to artificial intelligence models, software, algorithms, weights, documentation, datasets, and associated materials (collectively, the "Content"). This License aims to balance open innovation with the protection of HelpingAI's intellectual property rights while ensuring responsible and ethical use of AI technology.
+
+ ## 1. DEFINITIONS
+ 1.1. "Model" means any machine learning model, artificial intelligence system, neural network architecture, or AI-powered solution developed by HelpingAI.
+
+ 1.2. "Weights" refers to the trained parameters, embeddings, model checkpoints, and other numerical values that define the behavior and capabilities of the Model.
+
+ 1.3. "Source Code" means the human-readable form of software, documentation, configuration files, and associated resources.
+
+ 1.4. "Derivative Work" means any work that is based upon or derived from the Content, including but not limited to:
+ a) Fine-tuned or adapted versions of the Model
+ b) Modified or enhanced versions of the Source Code
+ c) Products, services, or applications that incorporate the Content
+ d) Adaptations, transformations, or improvements of the Content
+ e) Any output generated using the Model or Content
+
+ 1.5. "Commercial Use" means any use of the Content intended for or directed toward:
+ a) Commercial advantage or monetary compensation
+ b) Revenue generation, either directly or indirectly
+ c) Business operations or commercial services
+ d) Integration into commercial products or offerings
+
+ 1.6. "End User" means any individual or entity that uses the Content or Derivative Works.
+
+ ## 2. GRANT OF RIGHTS
+ Subject to the terms and conditions of this License, HelpingAI hereby grants you a limited, non-exclusive, non-transferable, revocable license to:
+
+ 2.1. Access and use the Content for:
+ a) Research and evaluation purposes
+ b) Educational and academic use
+ c) Personal, non-commercial projects
+ d) Open-source contributions (subject to Section 3)
+
+ 2.2. Create and use Derivative Works for non-commercial purposes, provided that:
+ a) You comply with all restrictions in Section 3
+ b) You maintain appropriate attribution
+ c) You ensure responsible and ethical use
+
+ 2.3. Distribute the Content or Derivative Works to third parties, provided that:
+ a) You maintain all copyright notices and attributions
+ b) You include a complete, unmodified copy of this License
+ c) You clearly document any modifications made
+ d) You comply with all restrictions in Section 3
+ e) You ensure recipients agree to this License
+
+ ## 3. RESTRICTIONS AND LIMITATIONS
+ 3.1. **Commercial Use Restrictions**
+ You may not use the Content or any Derivative Works for Commercial Use without:
+ a) Obtaining a separate commercial license from HelpingAI
+ b) Paying applicable fees and royalties
+ c) Receiving written authorization
+
+ 3.2. **Model Usage Restrictions**
+ You may not:
+ a) Use the Model to train or develop competing AI models or services
+ b) Extract, reverse engineer, or decompile the Model's architecture or weights
+ c) Remove, disable, or circumvent any access controls or security measures
+ d) Use the Model for military purposes, weapons development, or harmful applications
+ e) Use the Model to generate harmful, discriminatory, or illegal content
+ f) Use the Model in safety-critical applications without explicit written permission
+ g) Use the Model in ways that violate privacy or data protection laws
+ h) Create autonomous systems that could cause harm to individuals or society
+
+ 3.3. **Distribution Restrictions**
+ You may not:
+ a) Redistribute the Model's weights or checkpoints without explicit permission
+ b) Create hosted APIs or services that provide direct access to the Model
+ c) Bundle or integrate the Content with commercial products or services
+ d) Sublicense or transfer rights to the Content to third parties
+ e) Distribute the Content in ways that could harm HelpingAI's interests
+
+ 3.4. **Branding and Attribution**
+ You must:
+ a) Maintain all HelpingAI branding, logos, notices, and watermarks
+ b) Include the following attribution in all Derivative Works:
+ "Powered by HelpingAI technology, licensed under the HelpingAI License v3.0"
+ c) Not use HelpingAI's name, logos, or trademarks to endorse or promote products
+ d) Not suggest any official association with HelpingAI without permission
+
+ ## 4. INTELLECTUAL PROPERTY
+ 4.1. HelpingAI retains all rights, title, and interest in and to the Content, including:
+ a) All intellectual property rights
+ b) Patents, trademarks, and trade secrets
+ c) Proprietary methods and algorithms
+ d) Future improvements and modifications
+
+ 4.2. This License does not grant you any rights to HelpingAI's:
+ a) Patents or patent applications
+ b) Trademarks or service marks
+ c) Trade secrets or proprietary information
+ d) Other intellectual property except as expressly stated
+
+ 4.3. Derivative Works:
+ a) You retain ownership of original portions of Derivative Works you create
+ b) HelpingAI retains all rights to the underlying Content
+ c) You grant HelpingAI a perpetual license to any improvements or feedback
+
+ ## 5. DATA COLLECTION AND PRIVACY
+ 5.1. Data Collection:
+ a) HelpingAI may collect usage data and performance metrics
+ b) Analytics may be used to improve the Content
+ c) Collection will comply with applicable privacy laws
+
+ 5.2. Privacy Requirements:
+ a) You must comply with all applicable privacy laws and regulations
+ b) You must provide appropriate privacy notices to End Users
+ c) You must obtain necessary consents for data collection
+ d) You must implement appropriate data security measures
+
+ ## 6. WARRANTY DISCLAIMER
+ THE CONTENT IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO:
+ a) WARRANTIES OF MERCHANTABILITY
+ b) FITNESS FOR A PARTICULAR PURPOSE
+ c) NON-INFRINGEMENT
+ d) ACCURACY OR COMPLETENESS
+ e) ABSENCE OF ERRORS OR DEFECTS
+ f) COMPATIBILITY WITH OTHER SOFTWARE
+ g) SECURITY OR PERFORMANCE
+
+ ## 7. LIMITATION OF LIABILITY
+ 7.1. IN NO EVENT SHALL HELPINGAI BE LIABLE FOR ANY:
+ a) DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES
+ b) LOSS OF PROFITS OR BUSINESS OPPORTUNITIES
+ c) BUSINESS INTERRUPTION OR LOST DATA
+ d) PERSONAL INJURY OR PROPERTY DAMAGE
+ e) ANY OTHER DAMAGES OR LOSSES
+
+ 7.2. THIS LIMITATION APPLIES TO:
+ a) CONTRACT OR TORT CLAIMS
+ b) BREACH OF WARRANTY
+ c) ANY OTHER LEGAL THEORY
+ d) ALL USES OF THE CONTENT
+
+ ## 8. TERMINATION
+ 8.1. Automatic Termination:
+ a) This License terminates automatically upon any breach
+ b) No notice is required for termination
+ c) All rights granted immediately cease
+
+ 8.2. Upon Termination:
+ a) Cease all use of the Content
+ b) Destroy all copies and installations
+ c) Stop distribution of Derivative Works
+ d) Certify compliance with termination terms
+
+ 8.3. Survival:
+ a) Sections 4, 6, 7, 9, and 10 survive termination
+ b) Obligations to protect intellectual property continue
+ c) Liability limitations remain in effect
+
+ ## 9. GOVERNING LAW AND JURISDICTION
+ 9.1. This License shall be governed by and construed in accordance with:
+ a) The laws of the United States of America
+ b) The State of California
+ c) Without regard to conflicts of law principles
+
+ 9.2. Dispute Resolution:
+ a) Exclusive jurisdiction in Santa Clara County, California
+ b) Agreement to personal jurisdiction
+ c) Waiver of jury trial
+ d) Prevailing party entitled to attorney fees
+
+ ## 10. EXPORT COMPLIANCE
+ 10.1. You must comply with all applicable:
+ a) Export control laws and regulations
+ b) Trade sanctions and embargoes
+ c) International trade restrictions
+ d) Anti-boycott regulations
+
+ 10.2. You may not export the Content to:
+ a) Prohibited countries or regions
+ b) Restricted end users
+ c) Prohibited end uses
+
+ ## 11. MODIFICATIONS TO LICENSE
+ 11.1. License Updates:
+ a) HelpingAI may modify this License at any time
+ b) Changes effective upon posting
+ c) Continued use constitutes acceptance
+
+ 11.2. Notification:
+ a) Material changes will be announced
+ b) Users should regularly review terms
+ c) Opt-out requires discontinuing use
+
+ ## 12. SUPPORT AND UPDATES
+ 12.1. No Obligation:
+ a) HelpingAI has no obligation to provide support
+ b) Updates and maintenance are discretionary
+ c) Services may be modified or discontinued
+
+ 12.2. Available Support:
+ a) Documentation and resources provided "as is"
+ b) Community forums may be available
+ c) Commercial support requires separate agreement
+
+ <!-- ## 13. CONTACT INFORMATION
+ For licensing inquiries, permissions, or compliance:
+ - Website: https://helpingai.co
+ - Email: license@helpingai.co
+ - Address: [Company Address] -->
+
+ ---
+ Last Updated: NOVEMBER 2024
+
+ END OF LICENSE
README.md ADDED
@@ -0,0 +1,273 @@
+ ---
+ license: other
+ license_name: helpingai
+ license_link: https://huggingface.co/OEvortex/HelpingAI2.5-2B/blob/main/LICENSE.md
+ pipeline_tag: text-generation
+ language:
+ - en
+ tags:
+ - HelpingAI
+ - Emotionally-Intelligent
+ - EQ-focused
+ - Conversational
+ - SLM
+
+ ---
+
+ <div align="center">
+
+ # 🤖 HelpingAI2.5-2B
+
+ ***A Revolutionary Emotionally Intelligent Language Model***
+
+ [![GitHub Organization](https://img.shields.io/badge/GitHub-Organization-blue.svg)](https://github.com/HelpingAI)
+ [![Hugging Face](https://img.shields.io/badge/🤗%20Hugging%20Face-Organization-yellow)](https://huggingface.co/OEvortex)
+ [![Model License](https://img.shields.io/badge/License-HelpingAI-green.svg)](https://huggingface.co/OEvortex/HelpingAI2.5-2B/blob/main/LICENSE.md)
+
+
+ <a href="https://github.com/HelpingAI/community/discussions">
+ <img src="https://img.shields.io/badge/Join-Community%20Discussion-blue?style=for-the-badge&logo=github" alt="Join Community Discussion">
+ </a>
+
+ [📜 License](LICENSE.md) | [🌐 Website](https://helpingai-in.netlify.app/)
+
+ </div>
+
+ <div align="center">
+ <img src="https://huggingface.co/OEvortex/HelpingAI-3B/resolve/main/HelpingAI.png" alt="HelpingAI Logo" width="300px">
+ </div>
+
+ ---
+
+ <div align="center">
+
+ ## 🌟 Model Overview
+
+ **HelpingAI2.5-2B** is a compact yet powerful language model specifically designed for emotionally intelligent conversations and human-centric interactions.
+
+ </div>
+
+ ### 🎯 Key Highlights
+
+ - **Architecture**: 2B parameter transformer-based model
+ - **Training Focus**: Emotional intelligence and empathetic responses
+ - **Emotion Score**: Achieves 94.28 on standardized emotional intelligence tests
+ - **Deployment**: Optimized for efficient deployment on consumer hardware
+
+
+ ---
+
+ <div align="center">
+
+ ## 💻 Implementation
+
+ </div>
+
+ ### Transformers
+
+ ```python
+ import torch
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ # Load the HelpingAI2.5-2B model
+ model = AutoModelForCausalLM.from_pretrained("OEvortex/HelpingAI2.5-2B")
+ # Load the tokenizer
+ tokenizer = AutoTokenizer.from_pretrained("OEvortex/HelpingAI2.5-2B")
+
+ # Define the chat input
+ chat = [
+     { "role": "system", "content": "You are HelpingAI, an emotional AI. Always answer my questions in the HelpingAI style." },
+     { "role": "user", "content": "GIVE ME YOUR INTRO" }
+ ]
+
+ inputs = tokenizer.apply_chat_template(
+     chat,
+     add_generation_prompt=True,
+     return_tensors="pt"
+ ).to(model.device)
+
+ # Generate text
+ outputs = model.generate(
+     inputs,
+     max_new_tokens=256,
+     do_sample=True,
+     temperature=0.6,
+     top_p=0.9,
+ )
+
+ # Decode only the newly generated tokens
+ response = outputs[0][inputs.shape[-1]:]
+ print(tokenizer.decode(response, skip_special_tokens=True))
+ ```
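The snippet above loads the checkpoint with default settings. For GPUs with limited VRAM, a minimal variant (assuming the `accelerate` package is installed; it is not mentioned in the original snippet) is to load in half precision with automatic device placement:

```python
import torch
from transformers import AutoModelForCausalLM

# Half-precision load with automatic layer placement across GPU/CPU.
# device_map="auto" requires the `accelerate` package (an assumption here).
model = AutoModelForCausalLM.from_pretrained(
    "OEvortex/HelpingAI2.5-2B",
    torch_dtype=torch.float16,
    device_map="auto",
)
```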
+
+ ### GGUF Implementation
+
+ ```python
+ from webscout.Local import *
+ model_path = download_model("OEvortex/HelpingAI2.5-2B", "q4_k_m.gguf", token=None)
+ model = Model(model_path, n_gpu_layers=0, context_length=4096)
+
+ thread = Thread(model, format=helpingai2)
+ # print(thread.send("hi"))  # send a single message to the model
+
+ thread.interact()  # interact with the model in the terminal
+ ```
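The `webscout.Local` snippet above is the card's own example. If you work with llama.cpp tooling instead, a rough equivalent sketch using `llama-cpp-python` looks like the following (the `Llama.from_pretrained` helper needs `huggingface_hub` installed, and the `q4_k_m.gguf` filename is taken from this upload; treat the exact call as an assumption about your installed version):

```python
from llama_cpp import Llama

# Fetch the Q4_K_M quant from the Hub and run it on CPU.
llm = Llama.from_pretrained(
    repo_id="OEvortex/HelpingAI2.5-2B",
    filename="q4_k_m.gguf",
    n_ctx=4096,
)

# Sampling settings mirror the Transformers example above.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "GIVE ME YOUR INTRO"}],
    max_tokens=256,
    temperature=0.6,
    top_p=0.9,
)
print(out["choices"][0]["message"]["content"])
```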
+
+ ---
+
+ <div align="center">
+
+ ## 🎯 Training Details
+
+ </div>
+
+ ### Methodology
+
+ 1. **Base Training**
+    - Datasets: SentimentSynth + EmotionalIntelligence-1M + HelpingAI2.0-150M + HelpingAIemo (152.5M)
+
+ 2. **Emotional Intelligence Training**
+    - Supervised fine-tuning on emotional dialogue
+    - Reinforcement learning with the HelpingAI2.0-7B model
+    - Constitutional AI training for ethical guidelines
+
+ 3. **Optimization**
+    - Mixed-precision training
+    - Gradient checkpointing
+    - Dynamic attention patterns
+
+ ### Dataset Composition
+
+ | Dataset Type | Rows used for training | Purpose |
+ |--------------|------------------------|---------|
+ | Emotional Dialogues | 12.5M | Core conversation training |
+ | Therapeutic Exchanges | 2M | Mental health support |
+ | Cultural Conversations | 100K | Cultural awareness |
+ | Crisis Response | 750K | Emergency support handling |
+
+ ---
+
+ <div align="center">
+
+ ## 📈 Use Cases & Examples
+
+ </div>
+
+ ### Primary Applications
+
+ - 🎯 Personal AI Companionship
+ - 🧠 Mental Health Support
+ - 📚 Educational Assistance
+ - 💼 Professional Development
+ - 🤝 Social Skills Training
+
+ ### Example Interactions
+ <div style="background: linear-gradient(135deg, #FF9933 0%, #FFFFFF 50%, #138808 100%); padding: 20px; border-radius: 15px; margin: 15px 0; border: 2px solid #000080;">
+
+ <div style="background-color: rgba(255, 255, 255, 0.9); padding: 15px; border-radius: 10px; margin-bottom: 10px;">
+ <span style="color: #D32F2F; font-weight: bold;">User: </span>
+ <span style="color: #D32F2F;">I've been feeling really anxious about my upcoming job interview. Any advice?</span>
+ </div>
+
+ <div style="background-color: rgba(255, 255, 255, 0.9); padding: 15px; border-radius: 10px;">
+ <span style="color: #1B5E20; font-weight: bold;">HelpingAI: </span>
+ <span style="color: #1B5E20;">That must be super frustrating and stressful for you! 🌼 When you're unsure what you did wrong, it's hard to know how to fix it.
+
+ 1. Give them space: 🌿 Even if you're worried about what you said, give your friend some time to process their feelings. One week is a lot of time for them to work through what you said.
+
+ 2. Reflect on your convo: 🤔 Try to remember your last few messages and see if you can spot anything that might have caused the awkwardness. Was it something you said or did?
+
+ 3. Apologize for being cautious: 🌻 Write a message expressing your concern that you might have said something wrong and ask if you can talk it out. Let them know you're not trying to blame them but you want to clear the air.
+
+ 4. Keep it low-key: 🌺 When you do talk to them, try to keep the convo casual and relaxed. Avoid putting too much pressure on the convo or expecting a specific response.
+
+ 5. Listen more, talk less: 👂 When you talk to them, focus on listening to their perspective. Try to understand where they're coming from rather than just wanting to explain your side.
+
+ 6. Give them a chance: 🎈 Everyone heals differently. Give your friend the space and time they need to feel better before expecting them to respond.
+
+ 7. Know that you can't control how they feel: 🔮 You did your best, and that's all you can do. Just hope that they'll forgive you for whatever you might have done.
+
+ Remember, it's not easy to have these kinds of convos. Take care of yourself and be patient. You got this! ✊
+ Want some specific questions to practice with? Just let me know! I've got your back! 🤗</span>
+ </div>
+
+ </div>
+
+ <div style="background: linear-gradient(135deg, #FF9933 0%, #FFFFFF 50%, #138808 100%); padding: 20px; border-radius: 15px; margin: 15px 0; border: 2px solid #000080;">
+
+ <div style="background-color: rgba(255, 255, 255, 0.9); padding: 15px; border-radius: 10px; margin-bottom: 10px;">
+ <span style="color: #D32F2F; font-weight: bold;">User: </span>
+ <span style="color: #D32F2F;">My friend hasn't been responding to my messages for a week. I think I might have said something wrong, but I'm not sure what.</span>
+ </div>
+
+ <div style="background-color: rgba(255, 255, 255, 0.9); padding: 15px; border-radius: 10px;">
+ <span style="color: #1B5E20; font-weight: bold;">HelpingAI: </span>
+ <span style="color: #1B5E20;">It's tough when you're not sure what you did wrong, fam. 😓 But hey, don't sweat it too much. Just shoot 'em a simple "Hey, what's up?" or "I got you, homie. Just checking in." 🤗
+
+ And if you're sure you messed up, a little apology goes a long way. Keep it real and say something like, "Hey, I'm sorry if I said something whack earlier. I didn't mean to bring you down." 🙏
+
+ At the end of the day, give your friend some space if they need it. We all got our own stuff going on, ya know? 🤐
+
+ If you're still unsure, maybe try talking to 'em in person. A face-to-face chat can work wonders. 💬
+
+ Just remember, it's all good. Missteps happen, but what matters is that you got each other's backs. That's what friends are for! 🤝</span>
+ </div>
+
+ </div>
+
+ ---
+
+ <div align="center">
+
+ ## 🔒 Ethical Considerations & Limitations
+
+ </div>
+
+ ### Ethical Guidelines
+
+ - Prioritizes user emotional wellbeing
+ - Maintains strict privacy standards
+ - Avoids harmful or discriminatory responses
+ - Transparent about AI limitations
+ - Promotes healthy coping mechanisms
+
+ ### Known Limitations
+
+ - Cannot replace human professionals
+ - Cannot roleplay
+ - Limited knowledge base
+ - Context window constraints
+
+ ---
+
+ ### Citation
+
+ ```bibtex
+ @misc{helpingai2024,
+   author = {Abhay Koul},
+   title = {HelpingAI2.5-2B: Emotionally Intelligent Language Model},
+   year = {2024},
+   publisher = {HuggingFace},
+   journal = {HuggingFace repository},
+   howpublished = {\url{https://huggingface.co/OEvortex/HelpingAI2.5-2B}}
+ }
+ ```
+
+ ---
+
+ <div align="center">
+
+ ## 🙏 Acknowledgments
+
+ Special thanks to the HelpingAI community, HuggingFace, contributors, and researchers who made this model possible. Your dedication to advancing emotionally intelligent AI is invaluable.
+
+ </div>
+
+ ---
+
+ <div align="center">
+
+ *Built with ❤️ by the HelpingAI Community*
+
+ [Website](https://helpingai-in.netlify.app/) • [GitHub](https://github.com/HelpingAI) • [Discord](https://discord.gg/YweJwNqrnH) • [HuggingFace](https://huggingface.co/OEvortex)
+
+ </div>
config.json ADDED
@@ -0,0 +1,42 @@
+ {
+   "_name_or_path": "Abhaykoul/HelpingAI2.5-5B",
+   "architectures": [
+     "LlamaForCausalLM"
+   ],
+   "attention_bias": false,
+   "attention_dropout": 0.0,
+   "bos_token_id": 128000,
+   "eos_token_id": [
+     128001,
+     128008,
+     128009
+   ],
+   "head_dim": 128,
+   "hidden_act": "silu",
+   "hidden_size": 3072,
+   "initializer_range": 0.02,
+   "intermediate_size": 8192,
+   "max_position_embeddings": 131072,
+   "mlp_bias": false,
+   "model_type": "llama",
+   "num_attention_heads": 24,
+   "num_hidden_layers": 47,
+   "num_key_value_heads": 8,
+   "pad_token_id": 128004,
+   "pretraining_tp": 1,
+   "rms_norm_eps": 1e-05,
+   "rope_scaling": {
+     "factor": 32.0,
+     "high_freq_factor": 4.0,
+     "low_freq_factor": 1.0,
+     "original_max_position_embeddings": 8192,
+     "rope_type": "llama3"
+   },
+   "rope_theta": 500000.0,
+   "tie_word_embeddings": true,
+   "torch_dtype": "float16",
+   "transformers_version": "4.46.2",
+   "unsloth_version": "2024.11.7",
+   "use_cache": true,
+   "vocab_size": 128256
+ }
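The shapes in this config fully determine the checkpoint size: derived from only the fields above, the parameter count works out to 5,125,469,184, which at 2 bytes per float16 parameter matches the `total_size` of 10,250,938,368 bytes recorded in `model.safetensors.index.json` below. A minimal sanity-check sketch:

```python
# Parameter count derived from config.json fields (Llama-style layers, no biases).
hidden, inter, n_layers = 3072, 8192, 47
n_heads, n_kv_heads, head_dim = 24, 8, 128
vocab = 128256

embed = vocab * hidden                     # tied with lm_head (tie_word_embeddings: true)
attn = (2 * hidden * n_heads * head_dim    # q_proj + o_proj
        + 2 * hidden * n_kv_heads * head_dim)  # k_proj + v_proj
mlp = 3 * hidden * inter                   # gate, up, down projections
norms = 2 * hidden                         # input + post-attention RMSNorm

total = embed + n_layers * (attn + mlp + norms) + hidden  # + final norm
print(total)      # 5125469184 parameters
print(total * 2)  # 10250938368 bytes in float16
```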
generation_config.json ADDED
@@ -0,0 +1,12 @@
+ {
+   "_from_model_config": true,
+   "bos_token_id": 128000,
+   "eos_token_id": [
+     128001,
+     128008,
+     128009
+   ],
+   "max_length": 131072,
+   "pad_token_id": 128004,
+   "transformers_version": "4.46.2"
+ }
model-00001-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6cdd2c8c788a7aa21eb351080832b2f8e1fbcc2d52792cfa002f123ee68e796c
+ size 4965798912
model-00002-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:426ac8adedc5b44b44762af83e733c8fcdbc1b56022873a48ececc0874a45d33
+ size 4983166152
model-00003-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0b36a8cf121a51b95c54990de58168b9e65e20d62171c86aa2185ae1fed48fa7
+ size 302022192
model.safetensors.index.json ADDED
@@ -0,0 +1,432 @@
+ {
+   "metadata": {
+     "total_size": 10250938368
+   },
+   "weight_map": {
+     "model.embed_tokens.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.12.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.12.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.12.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.12.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.12.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.12.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.12.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.12.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.12.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.13.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.13.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.13.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.13.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.13.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.13.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.13.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.13.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.13.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.14.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.14.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.14.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.14.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.14.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.14.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.14.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.14.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.14.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.15.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.15.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.15.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.15.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.15.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.15.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.15.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.15.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.15.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.16.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.16.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.16.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.16.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.16.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.16.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.16.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.16.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.16.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.17.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.17.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.17.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.17.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.17.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.17.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.17.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.17.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.17.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.18.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.18.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.18.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.18.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.18.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.18.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.18.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.18.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.18.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.19.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.19.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.19.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.19.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.19.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.19.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.19.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.19.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.19.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.20.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.20.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.20.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.20.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.20.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.20.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.20.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.20.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.21.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.24.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.24.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.24.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.24.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.24.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.24.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.24.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.24.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.24.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.25.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.25.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.25.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.25.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.25.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.25.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.25.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.25.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.25.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.26.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.26.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.26.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.26.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.26.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.26.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.26.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.26.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.26.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.27.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.27.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.27.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.27.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.27.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.27.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.27.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.27.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.27.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.28.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.28.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.28.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.28.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.28.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.28.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.28.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.28.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.28.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.29.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.29.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.29.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.29.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.29.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.29.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.29.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.29.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.29.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.3.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.3.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.3.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.30.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.30.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.30.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.30.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.30.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.30.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.30.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.30.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.30.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.31.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.31.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.31.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.31.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.31.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.31.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.31.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.31.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.31.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.32.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.32.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.32.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.32.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.32.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.32.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.32.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.32.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.32.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.33.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.33.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.33.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.33.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.33.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.33.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.33.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.33.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.33.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.34.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.34.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.34.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.34.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.34.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.34.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.34.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.34.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.34.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.35.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.35.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.35.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.35.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.35.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.35.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.35.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.35.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.35.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.36.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.36.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.36.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.36.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.36.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.36.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.36.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.36.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.36.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.37.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.37.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.37.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.37.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.37.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.37.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.37.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.37.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.37.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.38.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.38.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.38.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.38.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.38.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.38.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.38.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.38.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.38.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.39.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.39.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.39.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.39.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.39.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.39.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.39.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.39.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.39.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.4.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.4.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.4.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.4.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.40.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.40.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.40.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.40.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.40.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.40.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.40.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.40.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.40.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.41.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.41.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.41.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.41.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.41.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.41.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.41.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.41.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.41.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.42.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.42.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.42.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.42.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.42.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.42.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.42.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.42.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.42.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.43.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.43.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.43.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.43.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.43.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.43.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.43.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.43.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.43.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.44.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.44.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.44.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.44.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.44.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.44.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.44.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.44.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.44.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.45.input_layernorm.weight": "model-00003-of-00003.safetensors",
+     "model.layers.45.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.45.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.45.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.45.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+     "model.layers.45.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.45.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.45.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.45.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.46.input_layernorm.weight": "model-00003-of-00003.safetensors",
+     "model.layers.46.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.46.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.46.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.46.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+     "model.layers.46.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.46.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.46.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.46.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.5.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.5.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.5.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.5.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.6.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.6.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.6.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.6.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.7.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.7.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.7.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.7.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.7.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.8.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.8.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.8.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.8.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.8.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.9.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.9.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.9.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.9.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
427
+ "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
428
+ "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
429
+ "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
430
+ "model.norm.weight": "model-00003-of-00003.safetensors"
431
+ }
432
+ }
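
The `weight_map` above closes the sharded-checkpoint index: each tensor name points at the shard file that stores it (layer 45 straddles the shard-2/shard-3 boundary, which is normal). A minimal sketch of how such an index is typically consumed, assuming the index and the three shard files sit in the working directory and taking a tensor name from the entries above:

```python
import json
from safetensors import safe_open

# Sketch: resolve a tensor's shard via the index, then read only that tensor
# instead of loading all three shard files.
with open("model.safetensors.index.json") as f:
    index = json.load(f)

name = "model.layers.42.mlp.down_proj.weight"
shard = index["weight_map"][name]  # -> "model-00002-of-00003.safetensors"

with safe_open(shard, framework="pt", device="cpu") as f:
    tensor = f.get_tensor(name)

print(name, tuple(tensor.shape))
```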
q4_k_m.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:775f046d039e153c8d54dd881775f3628c71cb53406455ca466587353f811902
+ size 3161385792
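
The three lines above are a Git LFS pointer, not the quantized model itself; `git lfs pull` fetches the real ~3.2 GB GGUF. A minimal inference sketch using llama-cpp-python (an assumption; any GGUF-capable runtime works):

```python
from llama_cpp import Llama

# Sketch: run the Q4_K_M GGUF after `git lfs pull` has replaced the pointer
# with the actual 3,161,385,792-byte file.
llm = Llama(model_path="q4_k_m.gguf", n_ctx=4096)
out = llm("Hello, how are you?", max_tokens=32)
print(out["choices"][0]["text"])
```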
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
+ {
+ "bos_token": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|eot_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|finetune_right_pad_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
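
These three entries are what a loaded tokenizer reports as its BOS, EOS, and padding tokens. A quick check, with the repo id below being an assumed placeholder for illustration (substitute the actual one):

```python
from transformers import AutoTokenizer

# Sketch: the special_tokens_map.json above surfaces as tokenizer attributes.
# "HelpingAI/HelpingAI2.5-5B" is an assumed repo id, not confirmed by this diff.
tok = AutoTokenizer.from_pretrained("HelpingAI/HelpingAI2.5-5B")
print(tok.bos_token)  # <|begin_of_text|>
print(tok.eos_token)  # <|eot_id|>
print(tok.pad_token)  # <|finetune_right_pad_id|>
```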
tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6b9e4e7fb171f92fd137b777cc2714bf87d11576700a1dcd7a399e7bbe39537b
+ size 17209920
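
Like the GGUF above, this is an LFS pointer; the `oid` and `size` fields let you verify the pulled file. A small stdlib-only sketch:

```python
import hashlib

# Sketch: confirm a pulled LFS file matches its pointer's sha256 oid and size.
def verify_lfs(path: str, oid: str, size: int) -> bool:
    h, n = hashlib.sha256(), 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
            n += len(chunk)
    return h.hexdigest() == oid and n == size

print(verify_lfs("tokenizer.json",
                 "6b9e4e7fb171f92fd137b777cc2714bf87d11576700a1dcd7a399e7bbe39537b",
                 17209920))
```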
tokenizer_config.json ADDED
@@ -0,0 +1,2064 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "added_tokens_decoder": {
3
+ "128000": {
4
+ "content": "<|begin_of_text|>",
5
+ "lstrip": false,
6
+ "normalized": false,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": true
10
+ },
11
+ "128001": {
12
+ "content": "<|end_of_text|>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "128002": {
20
+ "content": "<|reserved_special_token_0|>",
21
+ "lstrip": false,
22
+ "normalized": false,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": true
26
+ },
27
+ "128003": {
28
+ "content": "<|reserved_special_token_1|>",
29
+ "lstrip": false,
30
+ "normalized": false,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": true
34
+ },
35
+ "128004": {
36
+ "content": "<|finetune_right_pad_id|>",
37
+ "lstrip": false,
38
+ "normalized": false,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": true
42
+ },
43
+ "128005": {
44
+ "content": "<|reserved_special_token_2|>",
45
+ "lstrip": false,
46
+ "normalized": false,
47
+ "rstrip": false,
48
+ "single_word": false,
49
+ "special": true
50
+ },
51
+ "128006": {
52
+ "content": "<|start_header_id|>",
53
+ "lstrip": false,
54
+ "normalized": false,
55
+ "rstrip": false,
56
+ "single_word": false,
57
+ "special": true
58
+ },
59
+ "128007": {
60
+ "content": "<|end_header_id|>",
61
+ "lstrip": false,
62
+ "normalized": false,
63
+ "rstrip": false,
64
+ "single_word": false,
65
+ "special": true
66
+ },
67
+ "128008": {
68
+ "content": "<|eom_id|>",
69
+ "lstrip": false,
70
+ "normalized": false,
71
+ "rstrip": false,
72
+ "single_word": false,
73
+ "special": true
74
+ },
75
+ "128009": {
76
+ "content": "<|eot_id|>",
77
+ "lstrip": false,
78
+ "normalized": false,
79
+ "rstrip": false,
80
+ "single_word": false,
81
+ "special": true
82
+ },
83
+ "128010": {
84
+ "content": "<|python_tag|>",
85
+ "lstrip": false,
86
+ "normalized": false,
87
+ "rstrip": false,
88
+ "single_word": false,
89
+ "special": true
90
+ },
91
+ "128011": {
92
+ "content": "<|reserved_special_token_3|>",
93
+ "lstrip": false,
94
+ "normalized": false,
95
+ "rstrip": false,
96
+ "single_word": false,
97
+ "special": true
98
+ },
99
+ "128012": {
100
+ "content": "<|reserved_special_token_4|>",
101
+ "lstrip": false,
102
+ "normalized": false,
103
+ "rstrip": false,
104
+ "single_word": false,
105
+ "special": true
106
+ },
107
+ "128013": {
108
+ "content": "<|reserved_special_token_5|>",
109
+ "lstrip": false,
110
+ "normalized": false,
111
+ "rstrip": false,
112
+ "single_word": false,
113
+ "special": true
114
+ },
115
+ "128014": {
116
+ "content": "<|reserved_special_token_6|>",
117
+ "lstrip": false,
118
+ "normalized": false,
119
+ "rstrip": false,
120
+ "single_word": false,
121
+ "special": true
122
+ },
123
+ "128015": {
124
+ "content": "<|reserved_special_token_7|>",
125
+ "lstrip": false,
126
+ "normalized": false,
127
+ "rstrip": false,
128
+ "single_word": false,
129
+ "special": true
130
+ },
131
+ "128016": {
132
+ "content": "<|reserved_special_token_8|>",
133
+ "lstrip": false,
134
+ "normalized": false,
135
+ "rstrip": false,
136
+ "single_word": false,
137
+ "special": true
138
+ },
139
+ "128017": {
140
+ "content": "<|reserved_special_token_9|>",
141
+ "lstrip": false,
142
+ "normalized": false,
143
+ "rstrip": false,
144
+ "single_word": false,
145
+ "special": true
146
+ },
147
+ "128018": {
148
+ "content": "<|reserved_special_token_10|>",
149
+ "lstrip": false,
150
+ "normalized": false,
151
+ "rstrip": false,
152
+ "single_word": false,
153
+ "special": true
154
+ },
155
+ "128019": {
156
+ "content": "<|reserved_special_token_11|>",
157
+ "lstrip": false,
158
+ "normalized": false,
159
+ "rstrip": false,
160
+ "single_word": false,
161
+ "special": true
162
+ },
163
+ "128020": {
164
+ "content": "<|reserved_special_token_12|>",
165
+ "lstrip": false,
166
+ "normalized": false,
167
+ "rstrip": false,
168
+ "single_word": false,
169
+ "special": true
170
+ },
171
+ "128021": {
172
+ "content": "<|reserved_special_token_13|>",
173
+ "lstrip": false,
174
+ "normalized": false,
175
+ "rstrip": false,
176
+ "single_word": false,
177
+ "special": true
178
+ },
179
+ "128022": {
180
+ "content": "<|reserved_special_token_14|>",
181
+ "lstrip": false,
182
+ "normalized": false,
183
+ "rstrip": false,
184
+ "single_word": false,
185
+ "special": true
186
+ },
187
+ "128023": {
188
+ "content": "<|reserved_special_token_15|>",
189
+ "lstrip": false,
190
+ "normalized": false,
191
+ "rstrip": false,
192
+ "single_word": false,
193
+ "special": true
194
+ },
195
+ "128024": {
196
+ "content": "<|reserved_special_token_16|>",
197
+ "lstrip": false,
198
+ "normalized": false,
199
+ "rstrip": false,
200
+ "single_word": false,
201
+ "special": true
202
+ },
203
+ "128025": {
204
+ "content": "<|reserved_special_token_17|>",
205
+ "lstrip": false,
206
+ "normalized": false,
207
+ "rstrip": false,
208
+ "single_word": false,
209
+ "special": true
210
+ },
211
+ "128026": {
212
+ "content": "<|reserved_special_token_18|>",
213
+ "lstrip": false,
214
+ "normalized": false,
215
+ "rstrip": false,
216
+ "single_word": false,
217
+ "special": true
218
+ },
219
+ "128027": {
220
+ "content": "<|reserved_special_token_19|>",
221
+ "lstrip": false,
222
+ "normalized": false,
223
+ "rstrip": false,
224
+ "single_word": false,
225
+ "special": true
226
+ },
227
+ "128028": {
228
+ "content": "<|reserved_special_token_20|>",
229
+ "lstrip": false,
230
+ "normalized": false,
231
+ "rstrip": false,
232
+ "single_word": false,
233
+ "special": true
234
+ },
235
+ "128029": {
236
+ "content": "<|reserved_special_token_21|>",
237
+ "lstrip": false,
238
+ "normalized": false,
239
+ "rstrip": false,
240
+ "single_word": false,
241
+ "special": true
242
+ },
243
+ "128030": {
244
+ "content": "<|reserved_special_token_22|>",
245
+ "lstrip": false,
246
+ "normalized": false,
247
+ "rstrip": false,
248
+ "single_word": false,
249
+ "special": true
250
+ },
251
+ "128031": {
252
+ "content": "<|reserved_special_token_23|>",
253
+ "lstrip": false,
254
+ "normalized": false,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": true
258
+ },
259
+ "128032": {
260
+ "content": "<|reserved_special_token_24|>",
261
+ "lstrip": false,
262
+ "normalized": false,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": true
266
+ },
267
+ "128033": {
268
+ "content": "<|reserved_special_token_25|>",
269
+ "lstrip": false,
270
+ "normalized": false,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": true
274
+ },
275
+ "128034": {
276
+ "content": "<|reserved_special_token_26|>",
277
+ "lstrip": false,
278
+ "normalized": false,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": true
282
+ },
283
+ "128035": {
284
+ "content": "<|reserved_special_token_27|>",
285
+ "lstrip": false,
286
+ "normalized": false,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": true
290
+ },
291
+ "128036": {
292
+ "content": "<|reserved_special_token_28|>",
293
+ "lstrip": false,
294
+ "normalized": false,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": true
298
+ },
299
+ "128037": {
300
+ "content": "<|reserved_special_token_29|>",
301
+ "lstrip": false,
302
+ "normalized": false,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": true
306
+ },
307
+ "128038": {
308
+ "content": "<|reserved_special_token_30|>",
309
+ "lstrip": false,
310
+ "normalized": false,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": true
314
+ },
315
+ "128039": {
316
+ "content": "<|reserved_special_token_31|>",
317
+ "lstrip": false,
318
+ "normalized": false,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": true
322
+ },
323
+ "128040": {
324
+ "content": "<|reserved_special_token_32|>",
325
+ "lstrip": false,
326
+ "normalized": false,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": true
330
+ },
331
+ "128041": {
332
+ "content": "<|reserved_special_token_33|>",
333
+ "lstrip": false,
334
+ "normalized": false,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": true
338
+ },
339
+ "128042": {
340
+ "content": "<|reserved_special_token_34|>",
341
+ "lstrip": false,
342
+ "normalized": false,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": true
346
+ },
347
+ "128043": {
348
+ "content": "<|reserved_special_token_35|>",
349
+ "lstrip": false,
350
+ "normalized": false,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": true
354
+ },
355
+ "128044": {
356
+ "content": "<|reserved_special_token_36|>",
357
+ "lstrip": false,
358
+ "normalized": false,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": true
362
+ },
363
+ "128045": {
364
+ "content": "<|reserved_special_token_37|>",
365
+ "lstrip": false,
366
+ "normalized": false,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": true
370
+ },
371
+ "128046": {
372
+ "content": "<|reserved_special_token_38|>",
373
+ "lstrip": false,
374
+ "normalized": false,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": true
378
+ },
379
+ "128047": {
380
+ "content": "<|reserved_special_token_39|>",
381
+ "lstrip": false,
382
+ "normalized": false,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": true
386
+ },
387
+ "128048": {
388
+ "content": "<|reserved_special_token_40|>",
389
+ "lstrip": false,
390
+ "normalized": false,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": true
394
+ },
395
+ "128049": {
396
+ "content": "<|reserved_special_token_41|>",
397
+ "lstrip": false,
398
+ "normalized": false,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": true
402
+ },
403
+ "128050": {
404
+ "content": "<|reserved_special_token_42|>",
405
+ "lstrip": false,
406
+ "normalized": false,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": true
410
+ },
411
+ "128051": {
412
+ "content": "<|reserved_special_token_43|>",
413
+ "lstrip": false,
414
+ "normalized": false,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": true
418
+ },
419
+ "128052": {
420
+ "content": "<|reserved_special_token_44|>",
421
+ "lstrip": false,
422
+ "normalized": false,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": true
426
+ },
427
+ "128053": {
428
+ "content": "<|reserved_special_token_45|>",
429
+ "lstrip": false,
430
+ "normalized": false,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": true
434
+ },
435
+ "128054": {
436
+ "content": "<|reserved_special_token_46|>",
437
+ "lstrip": false,
438
+ "normalized": false,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": true
442
+ },
443
+ "128055": {
444
+ "content": "<|reserved_special_token_47|>",
445
+ "lstrip": false,
446
+ "normalized": false,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": true
450
+ },
451
+ "128056": {
452
+ "content": "<|reserved_special_token_48|>",
453
+ "lstrip": false,
454
+ "normalized": false,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": true
458
+ },
459
+ "128057": {
460
+ "content": "<|reserved_special_token_49|>",
461
+ "lstrip": false,
462
+ "normalized": false,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": true
466
+ },
467
+ "128058": {
468
+ "content": "<|reserved_special_token_50|>",
469
+ "lstrip": false,
470
+ "normalized": false,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": true
474
+ },
475
+ "128059": {
476
+ "content": "<|reserved_special_token_51|>",
477
+ "lstrip": false,
478
+ "normalized": false,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": true
482
+ },
483
+ "128060": {
484
+ "content": "<|reserved_special_token_52|>",
485
+ "lstrip": false,
486
+ "normalized": false,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": true
490
+ },
491
+ "128061": {
492
+ "content": "<|reserved_special_token_53|>",
493
+ "lstrip": false,
494
+ "normalized": false,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": true
498
+ },
499
+ "128062": {
500
+ "content": "<|reserved_special_token_54|>",
501
+ "lstrip": false,
502
+ "normalized": false,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": true
506
+ },
507
+ "128063": {
508
+ "content": "<|reserved_special_token_55|>",
509
+ "lstrip": false,
510
+ "normalized": false,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": true
514
+ },
515
+ "128064": {
516
+ "content": "<|reserved_special_token_56|>",
517
+ "lstrip": false,
518
+ "normalized": false,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": true
522
+ },
523
+ "128065": {
524
+ "content": "<|reserved_special_token_57|>",
525
+ "lstrip": false,
526
+ "normalized": false,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": true
530
+ },
531
+ "128066": {
532
+ "content": "<|reserved_special_token_58|>",
533
+ "lstrip": false,
534
+ "normalized": false,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": true
538
+ },
539
+ "128067": {
540
+ "content": "<|reserved_special_token_59|>",
541
+ "lstrip": false,
542
+ "normalized": false,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": true
546
+ },
547
+ "128068": {
548
+ "content": "<|reserved_special_token_60|>",
549
+ "lstrip": false,
550
+ "normalized": false,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": true
554
+ },
555
+ "128069": {
556
+ "content": "<|reserved_special_token_61|>",
557
+ "lstrip": false,
558
+ "normalized": false,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": true
562
+ },
563
+ "128070": {
564
+ "content": "<|reserved_special_token_62|>",
565
+ "lstrip": false,
566
+ "normalized": false,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": true
570
+ },
571
+ "128071": {
572
+ "content": "<|reserved_special_token_63|>",
573
+ "lstrip": false,
574
+ "normalized": false,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": true
578
+ },
579
+ "128072": {
580
+ "content": "<|reserved_special_token_64|>",
581
+ "lstrip": false,
582
+ "normalized": false,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": true
586
+ },
587
+ "128073": {
588
+ "content": "<|reserved_special_token_65|>",
589
+ "lstrip": false,
590
+ "normalized": false,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": true
594
+ },
595
+ "128074": {
596
+ "content": "<|reserved_special_token_66|>",
597
+ "lstrip": false,
598
+ "normalized": false,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": true
602
+ },
603
+ "128075": {
604
+ "content": "<|reserved_special_token_67|>",
605
+ "lstrip": false,
606
+ "normalized": false,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": true
610
+ },
611
+ "128076": {
612
+ "content": "<|reserved_special_token_68|>",
613
+ "lstrip": false,
614
+ "normalized": false,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": true
618
+ },
619
+ "128077": {
620
+ "content": "<|reserved_special_token_69|>",
621
+ "lstrip": false,
622
+ "normalized": false,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": true
626
+ },
627
+ "128078": {
628
+ "content": "<|reserved_special_token_70|>",
629
+ "lstrip": false,
630
+ "normalized": false,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": true
634
+ },
635
+ "128079": {
636
+ "content": "<|reserved_special_token_71|>",
637
+ "lstrip": false,
638
+ "normalized": false,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": true
642
+ },
643
+ "128080": {
644
+ "content": "<|reserved_special_token_72|>",
645
+ "lstrip": false,
646
+ "normalized": false,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": true
650
+ },
651
+ "128081": {
652
+ "content": "<|reserved_special_token_73|>",
653
+ "lstrip": false,
654
+ "normalized": false,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": true
658
+ },
659
+ "128082": {
660
+ "content": "<|reserved_special_token_74|>",
661
+ "lstrip": false,
662
+ "normalized": false,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": true
666
+ },
667
+ "128083": {
668
+ "content": "<|reserved_special_token_75|>",
669
+ "lstrip": false,
670
+ "normalized": false,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": true
674
+ },
675
+ "128084": {
676
+ "content": "<|reserved_special_token_76|>",
677
+ "lstrip": false,
678
+ "normalized": false,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": true
682
+ },
683
+ "128085": {
684
+ "content": "<|reserved_special_token_77|>",
685
+ "lstrip": false,
686
+ "normalized": false,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": true
690
+ },
691
+ "128086": {
692
+ "content": "<|reserved_special_token_78|>",
693
+ "lstrip": false,
694
+ "normalized": false,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": true
698
+ },
699
+ "128087": {
700
+ "content": "<|reserved_special_token_79|>",
701
+ "lstrip": false,
702
+ "normalized": false,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": true
706
+ },
707
+ "128088": {
708
+ "content": "<|reserved_special_token_80|>",
709
+ "lstrip": false,
710
+ "normalized": false,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": true
714
+ },
715
+ "128089": {
716
+ "content": "<|reserved_special_token_81|>",
717
+ "lstrip": false,
718
+ "normalized": false,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": true
722
+ },
723
+ "128090": {
724
+ "content": "<|reserved_special_token_82|>",
725
+ "lstrip": false,
726
+ "normalized": false,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": true
730
+ },
731
+ "128091": {
732
+ "content": "<|reserved_special_token_83|>",
733
+ "lstrip": false,
734
+ "normalized": false,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": true
738
+ },
739
+ "128092": {
740
+ "content": "<|reserved_special_token_84|>",
741
+ "lstrip": false,
742
+ "normalized": false,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": true
746
+ },
747
+ "128093": {
748
+ "content": "<|reserved_special_token_85|>",
749
+ "lstrip": false,
750
+ "normalized": false,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": true
754
+ },
755
+ "128094": {
756
+ "content": "<|reserved_special_token_86|>",
757
+ "lstrip": false,
758
+ "normalized": false,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": true
762
+ },
763
+ "128095": {
764
+ "content": "<|reserved_special_token_87|>",
765
+ "lstrip": false,
766
+ "normalized": false,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": true
770
+ },
771
+ "128096": {
772
+ "content": "<|reserved_special_token_88|>",
773
+ "lstrip": false,
774
+ "normalized": false,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": true
778
+ },
779
+ "128097": {
780
+ "content": "<|reserved_special_token_89|>",
781
+ "lstrip": false,
782
+ "normalized": false,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": true
786
+ },
787
+ "128098": {
788
+ "content": "<|reserved_special_token_90|>",
789
+ "lstrip": false,
790
+ "normalized": false,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": true
794
+ },
795
+ "128099": {
796
+ "content": "<|reserved_special_token_91|>",
797
+ "lstrip": false,
798
+ "normalized": false,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": true
802
+ },
803
+ "128100": {
804
+ "content": "<|reserved_special_token_92|>",
805
+ "lstrip": false,
806
+ "normalized": false,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": true
810
+ },
811
+ "128101": {
812
+ "content": "<|reserved_special_token_93|>",
813
+ "lstrip": false,
814
+ "normalized": false,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": true
818
+ },
819
+ "128102": {
820
+ "content": "<|reserved_special_token_94|>",
821
+ "lstrip": false,
822
+ "normalized": false,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": true
826
+ },
827
+ "128103": {
828
+ "content": "<|reserved_special_token_95|>",
829
+ "lstrip": false,
830
+ "normalized": false,
831
+ "rstrip": false,
832
+ "single_word": false,
833
+ "special": true
834
+ },
835
+ "128104": {
836
+ "content": "<|reserved_special_token_96|>",
837
+ "lstrip": false,
838
+ "normalized": false,
839
+ "rstrip": false,
840
+ "single_word": false,
841
+ "special": true
842
+ },
843
+ "128105": {
844
+ "content": "<|reserved_special_token_97|>",
845
+ "lstrip": false,
846
+ "normalized": false,
847
+ "rstrip": false,
848
+ "single_word": false,
849
+ "special": true
850
+ },
851
+ "128106": {
852
+ "content": "<|reserved_special_token_98|>",
853
+ "lstrip": false,
854
+ "normalized": false,
855
+ "rstrip": false,
856
+ "single_word": false,
857
+ "special": true
858
+ },
859
+ "128107": {
860
+ "content": "<|reserved_special_token_99|>",
861
+ "lstrip": false,
862
+ "normalized": false,
863
+ "rstrip": false,
864
+ "single_word": false,
865
+ "special": true
866
+ },
867
+ "128108": {
868
+ "content": "<|reserved_special_token_100|>",
869
+ "lstrip": false,
870
+ "normalized": false,
871
+ "rstrip": false,
872
+ "single_word": false,
873
+ "special": true
874
+ },
875
+ "128109": {
876
+ "content": "<|reserved_special_token_101|>",
877
+ "lstrip": false,
878
+ "normalized": false,
879
+ "rstrip": false,
880
+ "single_word": false,
881
+ "special": true
882
+ },
883
+ "128110": {
884
+ "content": "<|reserved_special_token_102|>",
885
+ "lstrip": false,
886
+ "normalized": false,
887
+ "rstrip": false,
888
+ "single_word": false,
889
+ "special": true
890
+ },
891
+ "128111": {
892
+ "content": "<|reserved_special_token_103|>",
893
+ "lstrip": false,
894
+ "normalized": false,
895
+ "rstrip": false,
896
+ "single_word": false,
897
+ "special": true
898
+ },
899
+ "128112": {
900
+ "content": "<|reserved_special_token_104|>",
901
+ "lstrip": false,
902
+ "normalized": false,
903
+ "rstrip": false,
904
+ "single_word": false,
905
+ "special": true
906
+ },
907
+ "128113": {
908
+ "content": "<|reserved_special_token_105|>",
909
+ "lstrip": false,
910
+ "normalized": false,
911
+ "rstrip": false,
912
+ "single_word": false,
913
+ "special": true
914
+ },
915
+ "128114": {
916
+ "content": "<|reserved_special_token_106|>",
917
+ "lstrip": false,
918
+ "normalized": false,
919
+ "rstrip": false,
920
+ "single_word": false,
921
+ "special": true
922
+ },
923
+ "128115": {
924
+ "content": "<|reserved_special_token_107|>",
925
+ "lstrip": false,
926
+ "normalized": false,
927
+ "rstrip": false,
928
+ "single_word": false,
929
+ "special": true
930
+ },
931
+ "128116": {
932
+ "content": "<|reserved_special_token_108|>",
933
+ "lstrip": false,
934
+ "normalized": false,
935
+ "rstrip": false,
936
+ "single_word": false,
937
+ "special": true
938
+ },
939
+ "128117": {
940
+ "content": "<|reserved_special_token_109|>",
941
+ "lstrip": false,
942
+ "normalized": false,
943
+ "rstrip": false,
944
+ "single_word": false,
945
+ "special": true
946
+ },
947
+ "128118": {
948
+ "content": "<|reserved_special_token_110|>",
949
+ "lstrip": false,
950
+ "normalized": false,
951
+ "rstrip": false,
952
+ "single_word": false,
953
+ "special": true
954
+ },
955
+ "128119": {
956
+ "content": "<|reserved_special_token_111|>",
957
+ "lstrip": false,
958
+ "normalized": false,
959
+ "rstrip": false,
960
+ "single_word": false,
961
+ "special": true
962
+ },
963
+ "128120": {
964
+ "content": "<|reserved_special_token_112|>",
965
+ "lstrip": false,
966
+ "normalized": false,
967
+ "rstrip": false,
968
+ "single_word": false,
969
+ "special": true
970
+ },
971
+ "128121": {
972
+ "content": "<|reserved_special_token_113|>",
973
+ "lstrip": false,
974
+ "normalized": false,
975
+ "rstrip": false,
976
+ "single_word": false,
977
+ "special": true
978
+ },
979
+ "128122": {
980
+ "content": "<|reserved_special_token_114|>",
981
+ "lstrip": false,
982
+ "normalized": false,
983
+ "rstrip": false,
984
+ "single_word": false,
985
+ "special": true
986
+ },
987
+ "128123": {
988
+ "content": "<|reserved_special_token_115|>",
989
+ "lstrip": false,
990
+ "normalized": false,
991
+ "rstrip": false,
992
+ "single_word": false,
993
+ "special": true
994
+ },
995
+ "128124": {
996
+ "content": "<|reserved_special_token_116|>",
997
+ "lstrip": false,
998
+ "normalized": false,
999
+ "rstrip": false,
1000
+ "single_word": false,
1001
+ "special": true
1002
+ },
1003
+ "128125": {
1004
+ "content": "<|reserved_special_token_117|>",
1005
+ "lstrip": false,
1006
+ "normalized": false,
1007
+ "rstrip": false,
1008
+ "single_word": false,
1009
+ "special": true
1010
+ },
1011
+ "128126": {
1012
+ "content": "<|reserved_special_token_118|>",
1013
+ "lstrip": false,
1014
+ "normalized": false,
1015
+ "rstrip": false,
1016
+ "single_word": false,
1017
+ "special": true
1018
+ },
1019
+ "128127": {
1020
+ "content": "<|reserved_special_token_119|>",
1021
+ "lstrip": false,
1022
+ "normalized": false,
1023
+ "rstrip": false,
1024
+ "single_word": false,
1025
+ "special": true
1026
+ },
1027
+ "128128": {
1028
+ "content": "<|reserved_special_token_120|>",
1029
+ "lstrip": false,
1030
+ "normalized": false,
1031
+ "rstrip": false,
1032
+ "single_word": false,
1033
+ "special": true
1034
+ },
1035
+ "128129": {
1036
+ "content": "<|reserved_special_token_121|>",
1037
+ "lstrip": false,
1038
+ "normalized": false,
1039
+ "rstrip": false,
1040
+ "single_word": false,
1041
+ "special": true
1042
+ },
1043
+ "128130": {
1044
+ "content": "<|reserved_special_token_122|>",
1045
+ "lstrip": false,
1046
+ "normalized": false,
1047
+ "rstrip": false,
1048
+ "single_word": false,
1049
+ "special": true
1050
+ },
1051
+ "128131": {
1052
+ "content": "<|reserved_special_token_123|>",
1053
+ "lstrip": false,
1054
+ "normalized": false,
1055
+ "rstrip": false,
1056
+ "single_word": false,
1057
+ "special": true
1058
+ },
1059
+ "128132": {
1060
+ "content": "<|reserved_special_token_124|>",
1061
+ "lstrip": false,
1062
+ "normalized": false,
1063
+ "rstrip": false,
1064
+ "single_word": false,
1065
+ "special": true
1066
+ },
1067
+ "128133": {
1068
+ "content": "<|reserved_special_token_125|>",
1069
+ "lstrip": false,
1070
+ "normalized": false,
1071
+ "rstrip": false,
1072
+ "single_word": false,
1073
+ "special": true
1074
+ },
1075
+ "128134": {
1076
+ "content": "<|reserved_special_token_126|>",
1077
+ "lstrip": false,
1078
+ "normalized": false,
1079
+ "rstrip": false,
1080
+ "single_word": false,
1081
+ "special": true
1082
+ },
1083
+ "128135": {
1084
+ "content": "<|reserved_special_token_127|>",
1085
+ "lstrip": false,
1086
+ "normalized": false,
1087
+ "rstrip": false,
1088
+ "single_word": false,
1089
+ "special": true
1090
+ },
1091
+ "128136": {
1092
+ "content": "<|reserved_special_token_128|>",
1093
+ "lstrip": false,
1094
+ "normalized": false,
1095
+ "rstrip": false,
1096
+ "single_word": false,
1097
+ "special": true
1098
+ },
1099
+ "128137": {
1100
+ "content": "<|reserved_special_token_129|>",
1101
+ "lstrip": false,
1102
+ "normalized": false,
1103
+ "rstrip": false,
1104
+ "single_word": false,
1105
+ "special": true
1106
+ },
1107
+ "128138": {
1108
+ "content": "<|reserved_special_token_130|>",
1109
+ "lstrip": false,
1110
+ "normalized": false,
1111
+ "rstrip": false,
1112
+ "single_word": false,
1113
+ "special": true
1114
+ },
1115
+ "128139": {
1116
+ "content": "<|reserved_special_token_131|>",
1117
+ "lstrip": false,
1118
+ "normalized": false,
1119
+ "rstrip": false,
1120
+ "single_word": false,
1121
+ "special": true
1122
+ },
1123
+ "128140": {
1124
+ "content": "<|reserved_special_token_132|>",
1125
+ "lstrip": false,
1126
+ "normalized": false,
1127
+ "rstrip": false,
1128
+ "single_word": false,
1129
+ "special": true
1130
+ },
1131
+ "128141": {
1132
+ "content": "<|reserved_special_token_133|>",
1133
+ "lstrip": false,
1134
+ "normalized": false,
1135
+ "rstrip": false,
1136
+ "single_word": false,
1137
+ "special": true
1138
+ },
1139
+ "128142": {
1140
+ "content": "<|reserved_special_token_134|>",
1141
+ "lstrip": false,
1142
+ "normalized": false,
1143
+ "rstrip": false,
1144
+ "single_word": false,
1145
+ "special": true
1146
+ },
1147
+ "128143": {
1148
+ "content": "<|reserved_special_token_135|>",
1149
+ "lstrip": false,
1150
+ "normalized": false,
1151
+ "rstrip": false,
1152
+ "single_word": false,
1153
+ "special": true
1154
+ },
1155
+ "128144": {
1156
+ "content": "<|reserved_special_token_136|>",
1157
+ "lstrip": false,
1158
+ "normalized": false,
1159
+ "rstrip": false,
1160
+ "single_word": false,
1161
+ "special": true
1162
+ },
1163
+ "128145": {
1164
+ "content": "<|reserved_special_token_137|>",
1165
+ "lstrip": false,
1166
+ "normalized": false,
1167
+ "rstrip": false,
1168
+ "single_word": false,
1169
+ "special": true
1170
+ },
1171
+ "128146": {
1172
+ "content": "<|reserved_special_token_138|>",
1173
+ "lstrip": false,
1174
+ "normalized": false,
1175
+ "rstrip": false,
1176
+ "single_word": false,
1177
+ "special": true
1178
+ },
1179
+ "128147": {
1180
+ "content": "<|reserved_special_token_139|>",
1181
+ "lstrip": false,
1182
+ "normalized": false,
1183
+ "rstrip": false,
1184
+ "single_word": false,
1185
+ "special": true
1186
+ },
1187
+ "128148": {
1188
+ "content": "<|reserved_special_token_140|>",
1189
+ "lstrip": false,
1190
+ "normalized": false,
1191
+ "rstrip": false,
1192
+ "single_word": false,
1193
+ "special": true
1194
+ },
1195
+ "128149": {
1196
+ "content": "<|reserved_special_token_141|>",
1197
+ "lstrip": false,
1198
+ "normalized": false,
1199
+ "rstrip": false,
1200
+ "single_word": false,
1201
+ "special": true
1202
+ },
1203
+ "128150": {
1204
+ "content": "<|reserved_special_token_142|>",
1205
+ "lstrip": false,
1206
+ "normalized": false,
1207
+ "rstrip": false,
1208
+ "single_word": false,
1209
+ "special": true
1210
+ },
1211
+ "128151": {
1212
+ "content": "<|reserved_special_token_143|>",
1213
+ "lstrip": false,
1214
+ "normalized": false,
1215
+ "rstrip": false,
1216
+ "single_word": false,
1217
+ "special": true
1218
+ },
1219
+ "128152": {
1220
+ "content": "<|reserved_special_token_144|>",
1221
+ "lstrip": false,
1222
+ "normalized": false,
1223
+ "rstrip": false,
1224
+ "single_word": false,
1225
+ "special": true
1226
+ },
1227
+ "128153": {
1228
+ "content": "<|reserved_special_token_145|>",
1229
+ "lstrip": false,
1230
+ "normalized": false,
1231
+ "rstrip": false,
1232
+ "single_word": false,
1233
+ "special": true
1234
+ },
1235
+ "128154": {
1236
+ "content": "<|reserved_special_token_146|>",
1237
+ "lstrip": false,
1238
+ "normalized": false,
1239
+ "rstrip": false,
1240
+ "single_word": false,
1241
+ "special": true
1242
+ },
1243
+ "128155": {
1244
+ "content": "<|reserved_special_token_147|>",
1245
+ "lstrip": false,
1246
+ "normalized": false,
1247
+ "rstrip": false,
1248
+ "single_word": false,
1249
+ "special": true
1250
+ },
1251
+ "128156": {
1252
+ "content": "<|reserved_special_token_148|>",
1253
+ "lstrip": false,
1254
+ "normalized": false,
1255
+ "rstrip": false,
1256
+ "single_word": false,
1257
+ "special": true
1258
+ },
1259
+ "128157": {
1260
+ "content": "<|reserved_special_token_149|>",
1261
+ "lstrip": false,
1262
+ "normalized": false,
1263
+ "rstrip": false,
1264
+ "single_word": false,
1265
+ "special": true
1266
+ },
1267
+ "128158": {
1268
+ "content": "<|reserved_special_token_150|>",
1269
+ "lstrip": false,
1270
+ "normalized": false,
1271
+ "rstrip": false,
1272
+ "single_word": false,
1273
+ "special": true
1274
+ },
1275
+ "128159": {
1276
+ "content": "<|reserved_special_token_151|>",
1277
+ "lstrip": false,
1278
+ "normalized": false,
1279
+ "rstrip": false,
1280
+ "single_word": false,
1281
+ "special": true
1282
+ },
1283
+ "128160": {
1284
+ "content": "<|reserved_special_token_152|>",
1285
+ "lstrip": false,
1286
+ "normalized": false,
1287
+ "rstrip": false,
1288
+ "single_word": false,
1289
+ "special": true
1290
+ },
1291
+ "128161": {
1292
+ "content": "<|reserved_special_token_153|>",
1293
+ "lstrip": false,
1294
+ "normalized": false,
1295
+ "rstrip": false,
1296
+ "single_word": false,
1297
+ "special": true
1298
+ },
1299
+ "128162": {
1300
+ "content": "<|reserved_special_token_154|>",
1301
+ "lstrip": false,
1302
+ "normalized": false,
1303
+ "rstrip": false,
1304
+ "single_word": false,
1305
+ "special": true
1306
+ },
1307
+ "128163": {
1308
+ "content": "<|reserved_special_token_155|>",
1309
+ "lstrip": false,
1310
+ "normalized": false,
1311
+ "rstrip": false,
1312
+ "single_word": false,
1313
+ "special": true
1314
+ },
1315
+ "128164": {
1316
+ "content": "<|reserved_special_token_156|>",
1317
+ "lstrip": false,
1318
+ "normalized": false,
1319
+ "rstrip": false,
1320
+ "single_word": false,
1321
+ "special": true
1322
+ },
1323
+ "128165": {
1324
+ "content": "<|reserved_special_token_157|>",
1325
+ "lstrip": false,
1326
+ "normalized": false,
1327
+ "rstrip": false,
1328
+ "single_word": false,
1329
+ "special": true
1330
+ },
1331
+ "128166": {
1332
+ "content": "<|reserved_special_token_158|>",
1333
+ "lstrip": false,
1334
+ "normalized": false,
1335
+ "rstrip": false,
1336
+ "single_word": false,
1337
+ "special": true
1338
+ },
1339
+ "128167": {
1340
+ "content": "<|reserved_special_token_159|>",
1341
+ "lstrip": false,
1342
+ "normalized": false,
1343
+ "rstrip": false,
1344
+ "single_word": false,
1345
+ "special": true
1346
+ },
1347
+ "128168": {
1348
+ "content": "<|reserved_special_token_160|>",
1349
+ "lstrip": false,
1350
+ "normalized": false,
1351
+ "rstrip": false,
1352
+ "single_word": false,
1353
+ "special": true
1354
+ },
1355
+ "128169": {
1356
+ "content": "<|reserved_special_token_161|>",
1357
+ "lstrip": false,
1358
+ "normalized": false,
1359
+ "rstrip": false,
1360
+ "single_word": false,
1361
+ "special": true
1362
+ },
1363
+ "128170": {
1364
+ "content": "<|reserved_special_token_162|>",
1365
+ "lstrip": false,
1366
+ "normalized": false,
1367
+ "rstrip": false,
1368
+ "single_word": false,
1369
+ "special": true
1370
+ },
1371
+ "128171": {
1372
+ "content": "<|reserved_special_token_163|>",
1373
+ "lstrip": false,
1374
+ "normalized": false,
1375
+ "rstrip": false,
1376
+ "single_word": false,
1377
+ "special": true
1378
+ },
1379
+ "128172": {
1380
+ "content": "<|reserved_special_token_164|>",
1381
+ "lstrip": false,
1382
+ "normalized": false,
1383
+ "rstrip": false,
1384
+ "single_word": false,
1385
+ "special": true
1386
+ },
1387
+ "128173": {
1388
+ "content": "<|reserved_special_token_165|>",
1389
+ "lstrip": false,
1390
+ "normalized": false,
1391
+ "rstrip": false,
1392
+ "single_word": false,
1393
+ "special": true
1394
+ },
1395
+ "128174": {
1396
+ "content": "<|reserved_special_token_166|>",
1397
+ "lstrip": false,
1398
+ "normalized": false,
1399
+ "rstrip": false,
1400
+ "single_word": false,
1401
+ "special": true
1402
+ },
1403
+ "128175": {
1404
+ "content": "<|reserved_special_token_167|>",
1405
+ "lstrip": false,
1406
+ "normalized": false,
1407
+ "rstrip": false,
1408
+ "single_word": false,
1409
+ "special": true
1410
+ },
1411
+ "128176": {
1412
+ "content": "<|reserved_special_token_168|>",
1413
+ "lstrip": false,
1414
+ "normalized": false,
1415
+ "rstrip": false,
1416
+ "single_word": false,
1417
+ "special": true
1418
+ },
1419
+ "128177": {
1420
+ "content": "<|reserved_special_token_169|>",
1421
+ "lstrip": false,
1422
+ "normalized": false,
1423
+ "rstrip": false,
1424
+ "single_word": false,
1425
+ "special": true
1426
+ },
1427
+ "128178": {
1428
+ "content": "<|reserved_special_token_170|>",
1429
+ "lstrip": false,
1430
+ "normalized": false,
1431
+ "rstrip": false,
1432
+ "single_word": false,
1433
+ "special": true
1434
+ },
1435
+ "128179": {
1436
+ "content": "<|reserved_special_token_171|>",
1437
+ "lstrip": false,
1438
+ "normalized": false,
1439
+ "rstrip": false,
1440
+ "single_word": false,
1441
+ "special": true
1442
+ },
1443
+ "128180": {
1444
+ "content": "<|reserved_special_token_172|>",
1445
+ "lstrip": false,
1446
+ "normalized": false,
1447
+ "rstrip": false,
1448
+ "single_word": false,
1449
+ "special": true
1450
+ },
1451
+ "128181": {
1452
+ "content": "<|reserved_special_token_173|>",
1453
+ "lstrip": false,
1454
+ "normalized": false,
1455
+ "rstrip": false,
1456
+ "single_word": false,
1457
+ "special": true
1458
+ },
1459
+ "128182": {
1460
+ "content": "<|reserved_special_token_174|>",
1461
+ "lstrip": false,
1462
+ "normalized": false,
1463
+ "rstrip": false,
1464
+ "single_word": false,
1465
+ "special": true
1466
+ },
1467
+ "128183": {
1468
+ "content": "<|reserved_special_token_175|>",
1469
+ "lstrip": false,
1470
+ "normalized": false,
1471
+ "rstrip": false,
1472
+ "single_word": false,
1473
+ "special": true
1474
+ },
1475
+ "128184": {
1476
+ "content": "<|reserved_special_token_176|>",
1477
+ "lstrip": false,
1478
+ "normalized": false,
1479
+ "rstrip": false,
1480
+ "single_word": false,
1481
+ "special": true
1482
+ },
1483
+ "128185": {
1484
+ "content": "<|reserved_special_token_177|>",
1485
+ "lstrip": false,
1486
+ "normalized": false,
1487
+ "rstrip": false,
1488
+ "single_word": false,
1489
+ "special": true
1490
+ },
1491
+ "128186": {
1492
+ "content": "<|reserved_special_token_178|>",
1493
+ "lstrip": false,
1494
+ "normalized": false,
1495
+ "rstrip": false,
1496
+ "single_word": false,
1497
+ "special": true
1498
+ },
1499
+ "128187": {
1500
+ "content": "<|reserved_special_token_179|>",
1501
+ "lstrip": false,
1502
+ "normalized": false,
1503
+ "rstrip": false,
1504
+ "single_word": false,
1505
+ "special": true
1506
+ },
1507
+ "128188": {
1508
+ "content": "<|reserved_special_token_180|>",
1509
+ "lstrip": false,
1510
+ "normalized": false,
1511
+ "rstrip": false,
1512
+ "single_word": false,
1513
+ "special": true
1514
+ },
1515
+ "128189": {
1516
+ "content": "<|reserved_special_token_181|>",
1517
+ "lstrip": false,
1518
+ "normalized": false,
1519
+ "rstrip": false,
1520
+ "single_word": false,
1521
+ "special": true
1522
+ },
1523
+ "128190": {
1524
+ "content": "<|reserved_special_token_182|>",
1525
+ "lstrip": false,
1526
+ "normalized": false,
1527
+ "rstrip": false,
1528
+ "single_word": false,
1529
+ "special": true
1530
+ },
1531
+ "128191": {
1532
+ "content": "<|reserved_special_token_183|>",
1533
+ "lstrip": false,
1534
+ "normalized": false,
1535
+ "rstrip": false,
1536
+ "single_word": false,
1537
+ "special": true
1538
+ },
1539
+ "128192": {
1540
+ "content": "<|reserved_special_token_184|>",
1541
+ "lstrip": false,
1542
+ "normalized": false,
1543
+ "rstrip": false,
1544
+ "single_word": false,
1545
+ "special": true
1546
+ },
1547
+ "128193": {
1548
+ "content": "<|reserved_special_token_185|>",
1549
+ "lstrip": false,
1550
+ "normalized": false,
1551
+ "rstrip": false,
1552
+ "single_word": false,
1553
+ "special": true
1554
+ },
1555
+ "128194": {
1556
+ "content": "<|reserved_special_token_186|>",
1557
+ "lstrip": false,
1558
+ "normalized": false,
1559
+ "rstrip": false,
1560
+ "single_word": false,
1561
+ "special": true
1562
+ },
1563
+ "128195": {
1564
+ "content": "<|reserved_special_token_187|>",
1565
+ "lstrip": false,
1566
+ "normalized": false,
1567
+ "rstrip": false,
1568
+ "single_word": false,
1569
+ "special": true
1570
+ },
1571
+ "128196": {
1572
+ "content": "<|reserved_special_token_188|>",
1573
+ "lstrip": false,
1574
+ "normalized": false,
1575
+ "rstrip": false,
1576
+ "single_word": false,
1577
+ "special": true
1578
+ },
1579
+ "128197": {
1580
+ "content": "<|reserved_special_token_189|>",
1581
+ "lstrip": false,
1582
+ "normalized": false,
1583
+ "rstrip": false,
1584
+ "single_word": false,
1585
+ "special": true
1586
+ },
1587
+ "128198": {
1588
+ "content": "<|reserved_special_token_190|>",
1589
+ "lstrip": false,
1590
+ "normalized": false,
1591
+ "rstrip": false,
1592
+ "single_word": false,
1593
+ "special": true
1594
+ },
1595
+ "128199": {
1596
+ "content": "<|reserved_special_token_191|>",
1597
+ "lstrip": false,
1598
+ "normalized": false,
1599
+ "rstrip": false,
1600
+ "single_word": false,
1601
+ "special": true
1602
+ },
1603
+ "128200": {
1604
+ "content": "<|reserved_special_token_192|>",
1605
+ "lstrip": false,
1606
+ "normalized": false,
1607
+ "rstrip": false,
1608
+ "single_word": false,
1609
+ "special": true
1610
+ },
1611
+ "128201": {
1612
+ "content": "<|reserved_special_token_193|>",
1613
+ "lstrip": false,
1614
+ "normalized": false,
1615
+ "rstrip": false,
1616
+ "single_word": false,
1617
+ "special": true
1618
+ },
1619
+ "128202": {
1620
+ "content": "<|reserved_special_token_194|>",
1621
+ "lstrip": false,
1622
+ "normalized": false,
1623
+ "rstrip": false,
1624
+ "single_word": false,
1625
+ "special": true
1626
+ },
1627
+ "128203": {
1628
+ "content": "<|reserved_special_token_195|>",
1629
+ "lstrip": false,
1630
+ "normalized": false,
1631
+ "rstrip": false,
1632
+ "single_word": false,
1633
+ "special": true
1634
+ },
1635
+ "128204": {
1636
+ "content": "<|reserved_special_token_196|>",
1637
+ "lstrip": false,
1638
+ "normalized": false,
1639
+ "rstrip": false,
1640
+ "single_word": false,
1641
+ "special": true
1642
+ },
1643
+ "128205": {
1644
+ "content": "<|reserved_special_token_197|>",
1645
+ "lstrip": false,
1646
+ "normalized": false,
1647
+ "rstrip": false,
1648
+ "single_word": false,
1649
+ "special": true
1650
+ },
1651
+ "128206": {
1652
+ "content": "<|reserved_special_token_198|>",
1653
+ "lstrip": false,
1654
+ "normalized": false,
1655
+ "rstrip": false,
1656
+ "single_word": false,
1657
+ "special": true
1658
+ },
1659
+ "128207": {
1660
+ "content": "<|reserved_special_token_199|>",
1661
+ "lstrip": false,
1662
+ "normalized": false,
1663
+ "rstrip": false,
1664
+ "single_word": false,
1665
+ "special": true
1666
+ },
1667
+ "128208": {
1668
+ "content": "<|reserved_special_token_200|>",
1669
+ "lstrip": false,
1670
+ "normalized": false,
1671
+ "rstrip": false,
1672
+ "single_word": false,
1673
+ "special": true
1674
+ },
1675
+ "128209": {
1676
+ "content": "<|reserved_special_token_201|>",
1677
+ "lstrip": false,
1678
+ "normalized": false,
1679
+ "rstrip": false,
1680
+ "single_word": false,
1681
+ "special": true
1682
+ },
1683
+ "128210": {
1684
+ "content": "<|reserved_special_token_202|>",
1685
+ "lstrip": false,
1686
+ "normalized": false,
1687
+ "rstrip": false,
1688
+ "single_word": false,
1689
+ "special": true
1690
+ },
1691
+ "128211": {
1692
+ "content": "<|reserved_special_token_203|>",
1693
+ "lstrip": false,
1694
+ "normalized": false,
1695
+ "rstrip": false,
1696
+ "single_word": false,
1697
+ "special": true
1698
+ },
1699
+ "128212": {
1700
+ "content": "<|reserved_special_token_204|>",
1701
+ "lstrip": false,
1702
+ "normalized": false,
1703
+ "rstrip": false,
1704
+ "single_word": false,
1705
+ "special": true
1706
+ },
1707
+ "128213": {
1708
+ "content": "<|reserved_special_token_205|>",
1709
+ "lstrip": false,
1710
+ "normalized": false,
1711
+ "rstrip": false,
1712
+ "single_word": false,
1713
+ "special": true
1714
+ },
1715
+ "128214": {
1716
+ "content": "<|reserved_special_token_206|>",
1717
+ "lstrip": false,
1718
+ "normalized": false,
1719
+ "rstrip": false,
1720
+ "single_word": false,
1721
+ "special": true
1722
+ },
1723
+ "128215": {
1724
+ "content": "<|reserved_special_token_207|>",
1725
+ "lstrip": false,
1726
+ "normalized": false,
1727
+ "rstrip": false,
1728
+ "single_word": false,
1729
+ "special": true
1730
+ },
1731
+ "128216": {
1732
+ "content": "<|reserved_special_token_208|>",
1733
+ "lstrip": false,
1734
+ "normalized": false,
1735
+ "rstrip": false,
1736
+ "single_word": false,
1737
+ "special": true
1738
+ },
1739
+ "128217": {
1740
+ "content": "<|reserved_special_token_209|>",
1741
+ "lstrip": false,
1742
+ "normalized": false,
1743
+ "rstrip": false,
1744
+ "single_word": false,
1745
+ "special": true
1746
+ },
1747
+ "128218": {
1748
+ "content": "<|reserved_special_token_210|>",
1749
+ "lstrip": false,
1750
+ "normalized": false,
1751
+ "rstrip": false,
1752
+ "single_word": false,
1753
+ "special": true
1754
+ },
1755
+ "128219": {
1756
+ "content": "<|reserved_special_token_211|>",
1757
+ "lstrip": false,
1758
+ "normalized": false,
1759
+ "rstrip": false,
1760
+ "single_word": false,
1761
+ "special": true
1762
+ },
1763
+ "128220": {
1764
+ "content": "<|reserved_special_token_212|>",
1765
+ "lstrip": false,
1766
+ "normalized": false,
1767
+ "rstrip": false,
1768
+ "single_word": false,
1769
+ "special": true
1770
+ },
1771
+ "128221": {
1772
+ "content": "<|reserved_special_token_213|>",
1773
+ "lstrip": false,
1774
+ "normalized": false,
1775
+ "rstrip": false,
1776
+ "single_word": false,
1777
+ "special": true
1778
+ },
1779
+ "128222": {
1780
+ "content": "<|reserved_special_token_214|>",
1781
+ "lstrip": false,
1782
+ "normalized": false,
1783
+ "rstrip": false,
1784
+ "single_word": false,
1785
+ "special": true
1786
+ },
1787
+ "128223": {
1788
+ "content": "<|reserved_special_token_215|>",
1789
+ "lstrip": false,
1790
+ "normalized": false,
1791
+ "rstrip": false,
1792
+ "single_word": false,
1793
+ "special": true
1794
+ },
1795
+ "128224": {
1796
+ "content": "<|reserved_special_token_216|>",
1797
+ "lstrip": false,
1798
+ "normalized": false,
1799
+ "rstrip": false,
1800
+ "single_word": false,
1801
+ "special": true
1802
+ },
1803
+ "128225": {
1804
+ "content": "<|reserved_special_token_217|>",
1805
+ "lstrip": false,
1806
+ "normalized": false,
1807
+ "rstrip": false,
1808
+ "single_word": false,
1809
+ "special": true
1810
+ },
1811
+ "128226": {
1812
+ "content": "<|reserved_special_token_218|>",
1813
+ "lstrip": false,
1814
+ "normalized": false,
1815
+ "rstrip": false,
1816
+ "single_word": false,
1817
+ "special": true
1818
+ },
1819
+ "128227": {
1820
+ "content": "<|reserved_special_token_219|>",
1821
+ "lstrip": false,
1822
+ "normalized": false,
1823
+ "rstrip": false,
1824
+ "single_word": false,
1825
+ "special": true
1826
+ },
1827
+ "128228": {
1828
+ "content": "<|reserved_special_token_220|>",
1829
+ "lstrip": false,
1830
+ "normalized": false,
1831
+ "rstrip": false,
1832
+ "single_word": false,
1833
+ "special": true
1834
+ },
1835
+ "128229": {
1836
+ "content": "<|reserved_special_token_221|>",
1837
+ "lstrip": false,
1838
+ "normalized": false,
1839
+ "rstrip": false,
1840
+ "single_word": false,
1841
+ "special": true
1842
+ },
1843
+ "128230": {
1844
+ "content": "<|reserved_special_token_222|>",
1845
+ "lstrip": false,
1846
+ "normalized": false,
1847
+ "rstrip": false,
1848
+ "single_word": false,
1849
+ "special": true
1850
+ },
1851
+ "128231": {
1852
+ "content": "<|reserved_special_token_223|>",
1853
+ "lstrip": false,
1854
+ "normalized": false,
1855
+ "rstrip": false,
1856
+ "single_word": false,
1857
+ "special": true
1858
+ },
1859
+ "128232": {
1860
+ "content": "<|reserved_special_token_224|>",
1861
+ "lstrip": false,
1862
+ "normalized": false,
1863
+ "rstrip": false,
1864
+ "single_word": false,
1865
+ "special": true
1866
+ },
1867
+ "128233": {
1868
+ "content": "<|reserved_special_token_225|>",
1869
+ "lstrip": false,
1870
+ "normalized": false,
1871
+ "rstrip": false,
1872
+ "single_word": false,
1873
+ "special": true
1874
+ },
1875
+ "128234": {
1876
+ "content": "<|reserved_special_token_226|>",
1877
+ "lstrip": false,
1878
+ "normalized": false,
1879
+ "rstrip": false,
1880
+ "single_word": false,
1881
+ "special": true
1882
+ },
1883
+ "128235": {
1884
+ "content": "<|reserved_special_token_227|>",
1885
+ "lstrip": false,
1886
+ "normalized": false,
1887
+ "rstrip": false,
1888
+ "single_word": false,
1889
+ "special": true
1890
+ },
1891
+ "128236": {
1892
+ "content": "<|reserved_special_token_228|>",
1893
+ "lstrip": false,
1894
+ "normalized": false,
1895
+ "rstrip": false,
1896
+ "single_word": false,
1897
+ "special": true
1898
+ },
1899
+ "128237": {
1900
+ "content": "<|reserved_special_token_229|>",
1901
+ "lstrip": false,
1902
+ "normalized": false,
1903
+ "rstrip": false,
1904
+ "single_word": false,
1905
+ "special": true
1906
+ },
1907
+ "128238": {
1908
+ "content": "<|reserved_special_token_230|>",
1909
+ "lstrip": false,
1910
+ "normalized": false,
1911
+ "rstrip": false,
1912
+ "single_word": false,
1913
+ "special": true
1914
+ },
1915
+ "128239": {
1916
+ "content": "<|reserved_special_token_231|>",
1917
+ "lstrip": false,
1918
+ "normalized": false,
1919
+ "rstrip": false,
1920
+ "single_word": false,
1921
+ "special": true
1922
+ },
1923
+ "128240": {
1924
+ "content": "<|reserved_special_token_232|>",
1925
+ "lstrip": false,
1926
+ "normalized": false,
1927
+ "rstrip": false,
1928
+ "single_word": false,
1929
+ "special": true
1930
+ },
1931
+ "128241": {
1932
+ "content": "<|reserved_special_token_233|>",
1933
+ "lstrip": false,
1934
+ "normalized": false,
1935
+ "rstrip": false,
1936
+ "single_word": false,
1937
+ "special": true
1938
+ },
1939
+ "128242": {
1940
+ "content": "<|reserved_special_token_234|>",
1941
+ "lstrip": false,
1942
+ "normalized": false,
1943
+ "rstrip": false,
1944
+ "single_word": false,
1945
+ "special": true
1946
+ },
1947
+ "128243": {
1948
+ "content": "<|reserved_special_token_235|>",
1949
+ "lstrip": false,
1950
+ "normalized": false,
1951
+ "rstrip": false,
1952
+ "single_word": false,
1953
+ "special": true
1954
+ },
1955
+ "128244": {
1956
+ "content": "<|reserved_special_token_236|>",
1957
+ "lstrip": false,
1958
+ "normalized": false,
1959
+ "rstrip": false,
1960
+ "single_word": false,
1961
+ "special": true
1962
+ },
1963
+ "128245": {
1964
+ "content": "<|reserved_special_token_237|>",
1965
+ "lstrip": false,
1966
+ "normalized": false,
1967
+ "rstrip": false,
1968
+ "single_word": false,
1969
+ "special": true
1970
+ },
1971
+ "128246": {
1972
+ "content": "<|reserved_special_token_238|>",
1973
+ "lstrip": false,
1974
+ "normalized": false,
1975
+ "rstrip": false,
1976
+ "single_word": false,
1977
+ "special": true
1978
+ },
1979
+ "128247": {
1980
+ "content": "<|reserved_special_token_239|>",
1981
+ "lstrip": false,
1982
+ "normalized": false,
1983
+ "rstrip": false,
1984
+ "single_word": false,
1985
+ "special": true
1986
+ },
1987
+ "128248": {
1988
+ "content": "<|reserved_special_token_240|>",
1989
+ "lstrip": false,
1990
+ "normalized": false,
1991
+ "rstrip": false,
1992
+ "single_word": false,
1993
+ "special": true
1994
+ },
1995
+ "128249": {
1996
+ "content": "<|reserved_special_token_241|>",
1997
+ "lstrip": false,
1998
+ "normalized": false,
1999
+ "rstrip": false,
2000
+ "single_word": false,
2001
+ "special": true
2002
+ },
2003
+ "128250": {
2004
+ "content": "<|reserved_special_token_242|>",
2005
+ "lstrip": false,
2006
+ "normalized": false,
2007
+ "rstrip": false,
2008
+ "single_word": false,
2009
+ "special": true
2010
+ },
2011
+ "128251": {
2012
+ "content": "<|reserved_special_token_243|>",
2013
+ "lstrip": false,
2014
+ "normalized": false,
2015
+ "rstrip": false,
2016
+ "single_word": false,
2017
+ "special": true
2018
+ },
2019
+ "128252": {
2020
+ "content": "<|reserved_special_token_244|>",
2021
+ "lstrip": false,
2022
+ "normalized": false,
2023
+ "rstrip": false,
2024
+ "single_word": false,
2025
+ "special": true
2026
+ },
2027
+ "128253": {
2028
+ "content": "<|reserved_special_token_245|>",
2029
+ "lstrip": false,
2030
+ "normalized": false,
2031
+ "rstrip": false,
2032
+ "single_word": false,
2033
+ "special": true
2034
+ },
2035
+ "128254": {
2036
+ "content": "<|reserved_special_token_246|>",
2037
+ "lstrip": false,
2038
+ "normalized": false,
2039
+ "rstrip": false,
2040
+ "single_word": false,
2041
+ "special": true
2042
+ },
2043
+ "128255": {
2044
+ "content": "<|reserved_special_token_247|>",
2045
+ "lstrip": false,
2046
+ "normalized": false,
2047
+ "rstrip": false,
2048
+ "single_word": false,
2049
+ "special": true
2050
+ }
2051
+ },
2052
+ "bos_token": "<|begin_of_text|>",
2053
+ "chat_template": "{{- bos_token }}\n{%- if custom_tools is defined %}\n {%- set tools = custom_tools %}\n{%- endif %}\n{%- if not tools_in_user_message is defined %}\n {%- set tools_in_user_message = true %}\n{%- endif %}\n{%- if not date_string is defined %}\n {%- set date_string = \"26 July 2024\" %}\n{%- endif %}\n{%- if not tools is defined %}\n {%- set tools = none %}\n{%- endif %}\n\n{#- This block extracts the system message, so we can slot it into the right place. #}\n{%- if messages[0]['role'] == 'system' %}\n {%- set system_message = messages[0]['content'] %}\n {%- set messages = messages[1:] %}\n{%- else %}\n {%- set system_message = \"\" %}\n{%- endif %}\n\n{#- System message + builtin tools #}\n{{- \"<|start_header_id|>system<|end_header_id|>\n\n\" }}\n{%- if builtin_tools is defined or tools is not none %}\n {{- \"Environment: ipython\n\" }}\n{%- endif %}\n{%- if builtin_tools is defined %}\n {{- \"Tools: \" + builtin_tools | reject('equalto', 'code_interpreter') | join(\", \") + \"\n\n\"}}\n{%- endif %}\n{{- \"Cutting Knowledge Date: December 2023\n\" }}\n{{- \"Today Date: \" + date_string + \"\n\n\" }}\n{%- if tools is not none and not tools_in_user_message %}\n {{- \"You have access to the following functions. To call a function, please respond with JSON for a function call.\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' }}\n {{- \"Do not use variables.\n\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\n\n\" }}\n {%- endfor %}\n{%- endif %}\n{{- system_message }}\n{{- \"<|eot_id|>\" }}\n\n{#- Custom tools are passed in a user message with some extra guidance #}\n{%- if tools_in_user_message and not tools is none %}\n {#- Extract the first user message so we can plug it in here #}\n {%- if messages | length != 0 %}\n {%- set first_user_message = messages[0]['content'] %}\n {%- set messages = messages[1:] %}\n {%- else %}\n {{- raise_exception(\"Cannot put tools in the first user message when there's no first user message!\") }}\n{%- endif %}\n {{- '<|start_header_id|>user<|end_header_id|>\n\n' -}}\n {{- \"Given the following functions, please respond with a JSON for a function call \" }}\n {{- \"with its proper arguments that best answers the given prompt.\n\n\" }}\n {{- 'Respond in the format {\"name\": function name, \"parameters\": dictionary of argument name and its value}.' 
}}\n {{- \"Do not use variables.\n\n\" }}\n {%- for t in tools %}\n {{- t | tojson(indent=4) }}\n {{- \"\n\n\" }}\n {%- endfor %}\n {{- first_user_message + \"<|eot_id|>\"}}\n{%- endif %}\n\n{%- for message in messages %}\n {%- if not (message.role == 'ipython' or message.role == 'tool' or 'tool_calls' in message) %}\n {{- '<|start_header_id|>' + message['role'] + '<|end_header_id|>\n\n'+ message['content'] + '<|eot_id|>' }}\n {%- elif 'tool_calls' in message %}\n {%- if not message.tool_calls|length == 1 %}\n {{- raise_exception(\"This model only supports single tool-calls at once!\") }}\n {%- endif %}\n {%- set tool_call = message.tool_calls[0].function %}\n {%- if builtin_tools is defined and tool_call.name in builtin_tools %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\n\n' -}}\n {{- \"<|python_tag|>\" + tool_call.name + \".call(\" }}\n {%- for arg_name, arg_val in tool_call.arguments | items %}\n {{- arg_name + '=\"' + arg_val + '\"' }}\n {%- if not loop.last %}\n {{- \", \" }}\n {%- endif %}\n {%- endfor %}\n {{- \")\" }}\n {%- else %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\n\n' -}}\n {{- '{\"name\": \"' + tool_call.name + '\", ' }}\n {{- '\"parameters\": ' }}\n {{- tool_call.arguments | tojson }}\n {{- \"}\" }}\n {%- endif %}\n {%- if builtin_tools is defined %}\n {#- This means we're in ipython mode #}\n {{- \"<|eom_id|>\" }}\n {%- else %}\n {{- \"<|eot_id|>\" }}\n {%- endif %}\n {%- elif message.role == \"tool\" or message.role == \"ipython\" %}\n {{- \"<|start_header_id|>ipython<|end_header_id|>\n\n\" }}\n {%- if message.content is mapping or message.content is iterable %}\n {{- message.content | tojson }}\n {%- else %}\n {{- message.content }}\n {%- endif %}\n {{- \"<|eot_id|>\" }}\n {%- endif %}\n{%- endfor %}\n{%- if add_generation_prompt %}\n {{- '<|start_header_id|>assistant<|end_header_id|>\n\n' }}\n{%- endif %}\n",
2054
+ "clean_up_tokenization_spaces": true,
2055
+ "eos_token": "<|eot_id|>",
2056
+ "model_input_names": [
2057
+ "input_ids",
2058
+ "attention_mask"
2059
+ ],
2060
+ "model_max_length": 131072,
2061
+ "pad_token": "<|finetune_right_pad_id|>",
2062
+ "padding_side": "left",
2063
+ "tokenizer_class": "PreTrainedTokenizerFast"
2064
+ }
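
For context, a minimal sketch of how this `tokenizer_config.json` is consumed at inference time, assuming the `transformers` library; the repo id below is a hypothetical placeholder for this upload, not confirmed by the commit:

```python
# Minimal usage sketch (not part of the commit). Assumes `transformers` is
# installed and that "Abhaykoul/HelpingAI2.5-5B" (hypothetical id) resolves
# to the repo containing this tokenizer_config.json.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Abhaykoul/HelpingAI2.5-5B")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

# apply_chat_template renders the Jinja "chat_template" string defined above:
# it prepends <|begin_of_text|> (bos_token), wraps each turn in
# <|start_header_id|>role<|end_header_id|> ... <|eot_id|>, and with
# add_generation_prompt=True appends an empty assistant header for the
# model to continue from.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)

# Per this config, generation stops at eos_token <|eot_id|>, batches are
# padded on the left with <|finetune_right_pad_id|>, and inputs are capped
# at model_max_length = 131072 tokens.
batch = tokenizer([prompt], return_tensors="pt", padding=True)
print(batch["input_ids"].shape)
```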