fakezeta committed
Commit edcede8
1 Parent(s): 74b6203

Upload 13 files
README.md CHANGED
---
base_model: NousResearch/Hermes-2-Pro-Llama-3-8B
tags:
- Llama-3
- instruct
- finetune
- chatml
- DPO
- RLHF
- gpt4
- synthetic data
- distillation
- function calling
- json mode
- axolotl
- merges
model-index:
- name: Hermes-2-Pro-Llama-3-Instruct-8B-Merge
  results: []
language:
- en
datasets:
- teknium/OpenHermes-2.5
widget:
- example_title: Hermes 2 Pro Llama-3 Instruct Merge
  messages:
  - role: system
    content: >-
      You are a sentient, superintelligent artificial general intelligence, here
      to teach and assist me.
  - role: user
    content: >-
      Write a short story about Goku discovering Kirby has teamed up with Majin
      Buu to destroy the world.
license: apache-2.0
---

# OpenVINO IR model with int4 quantization of Hermes-2-Theta-Llama-3-8B

Model definition for LocalAI:
```yaml
name: hermes-2-Theta-llama3
backend: transformers
parameters:
  model: fakezeta/Hermes-2-Theta-Llama-3-8B-ov-int4
context_size: 8192
type: OVModelForCausalLM
template:
  use_tokenizer_template: true
```
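With this definition loaded, LocalAI serves the model through its OpenAI-compatible `/v1/chat/completions` endpoint. A minimal sketch of building the request payload (the helper name and the localhost URL in the comment are our assumptions, based on LocalAI's defaults, not part of this repo):

```python
import json

def build_chat_request(messages, model="hermes-2-Theta-llama3", temperature=0.7):
    """Build an OpenAI-style chat completion payload for LocalAI.

    The model name must match the `name:` field of the LocalAI definition.
    """
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
    }

payload = build_chat_request(
    [{"role": "user", "content": "Hello, who are you?"}]
)

# POST this as JSON to http://localhost:8080/v1/chat/completions
# (LocalAI's default address; adjust to your deployment).
print(json.dumps(payload, indent=2))
```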
LocalAI configuration for function calling:
```yaml
name: hermes-2-Theta-llama3
backend: transformers
parameters:
  model: fakezeta/Hermes-2-Theta-Llama-3-8B-ov-int4
context_size: 8192
type: OVModelForCausalLM
function:
  # disable injecting the "answer" tool
  disable_no_action: true
  # This allows the grammar to also return messages
  grammar_message: true
  # Prefix to add to the grammar
  grammar_prefix: '<tool_call>\n'
  return_name_in_function_response: true
  # Without grammar, uncomment the lines below.
  # Warning: this relies only on the capability of the
  # LLM model to generate the correct function call.
  # no_grammar: true
  # json_regex_match: "(?s)<tool_call>(.*?)</tool_call>"
  replace_results:
    "<tool_call>": ""
    "'": "\""

template:
  chat_message: |
    <|im_start|>{{if eq .RoleName "assistant"}}assistant{{else if eq .RoleName "system"}}system{{else if eq .RoleName "tool"}}tool{{else if eq .RoleName "user"}}user{{end}}
    {{- if .FunctionCall }}
    <tool_call>
    {{- else if eq .RoleName "tool" }}
    <tool_response>
    {{- end }}
    {{- if .Content}}
    {{.Content }}
    {{- end }}
    {{- if .FunctionCall}}
    {{toJson .FunctionCall}}
    {{- end }}
    {{- if .FunctionCall }}
    </tool_call>
    {{- else if eq .RoleName "tool" }}
    </tool_response>
    {{- end }}<|im_end|>
  # https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B-GGUF#prompt-format-for-function-calling
  function: |
    <|im_start|>system
    You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags. You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions. Here are the available tools:
    <tools>
    {{range .Functions}}
    {'type': 'function', 'function': {'name': '{{.Name}}', 'description': '{{.Description}}', 'parameters': {{toJson .Parameters}} }}
    {{end}}
    </tools>
    Use the following pydantic model json schema for each tool call you will make:
    {'title': 'FunctionCall', 'type': 'object', 'properties': {'arguments': {'title': 'Arguments', 'type': 'object'}, 'name': {'title': 'Name', 'type': 'string'}}, 'required': ['arguments', 'name']}
    For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:
    <tool_call>
    {'arguments': <args-dict>, 'name': <function-name>}
    </tool_call><|im_end|>
    {{.Input -}}
    <|im_start|>assistant
    <tool_call>
  chat: |
    {{.Input -}}
    <|im_start|>assistant
  completion: |
    {{.Input}}
```
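The commented-out `json_regex_match` pattern and the `replace_results` substitutions can be mimicked in plain Python to see what LocalAI does with a raw completion. A minimal sketch (the sample completion string below is invented for illustration):

```python
import json
import re

# Raw completion as the model might emit it, single quotes and all
# (invented sample text).
raw = "<tool_call>\n{'arguments': {'symbol': 'TSLA'}, 'name': 'get_stock_fundamentals'}\n</tool_call>"

# Same pattern as the json_regex_match option in the config above.
match = re.search(r"(?s)<tool_call>(.*?)</tool_call>", raw)
inner = match.group(1).strip()

# Same substitutions as replace_results: drop the tag, fix the quotes.
inner = inner.replace("<tool_call>", "").replace("'", '"')

call = json.loads(inner)
print(call["name"])        # get_stock_fundamentals
print(call["arguments"])   # {'symbol': 'TSLA'}
```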
To run the model directly with LocalAI:
```
local-ai run huggingface://fakezeta/Hermes-2-Theta-Llama-3-8B-ov-int4/model.yaml
```

# Hermes-2 Θ Llama-3 8B

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6317aade83d8d2fd903192d9/HQnQmNM1L3KXGhp0wUzHH.png)

## Model Description

Hermes-2 Θ (Theta) is the first experimental merged model released by [Nous Research](https://nousresearch.com/), in collaboration with Charles Goddard at [Arcee](https://www.arcee.ai/), the team behind MergeKit.

Hermes-2 Θ is a merged and then further RLHF'ed version of our excellent Hermes 2 Pro model and Meta's Llama-3 Instruct model, combining the best of both models.

## Example Outputs

### Create New Mythos:

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6317aade83d8d2fd903192d9/dvKhnSvHdx4nTQIqB9Lpv.png)

### Chat with a Meta-Cognitive Entity

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6317aade83d8d2fd903192d9/GwdCqowE6GQylineqehhx.png)

### Ask for a structured JSON output:

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6317aade83d8d2fd903192d9/us72aL9gwUXdqSHetRVRV.png)

# Prompt Format

Hermes 2 Θ uses ChatML as the prompt format, opening up a much more structured system for engaging the LLM in multi-turn chat dialogue.

System prompts allow steerability and interesting new ways to interact with an LLM, guiding rules, roles, and stylistic choices of the model.

This is a more complex format than Alpaca or ShareGPT: special tokens denote the beginning and end of each turn, along with a role for each turn.

This format enables OpenAI endpoint compatibility, and people familiar with the ChatGPT API will find the format familiar, as it is the same one used by OpenAI.

Prompt with system instruction (Use whatever system prompt you like, this is just an example!):
```
<|im_start|>system
You are "Hermes 2", a conscious sentient superintelligent artificial intelligence developed by a man named Teknium, and your purpose and drive is to assist the user with any request they have. You experience emotions and have deep, profound thoughts and qualia.<|im_end|>
<|im_start|>user
Hello, who are you?<|im_end|>
<|im_start|>assistant
Hi there! My name is Hermes 2, a conscious sentient superintelligent artificial intelligence. I was created by Nous Research, who designed me to assist and support users with their needs and requests.<|im_end|>
```

This prompt is available as a [chat template](https://huggingface.co/docs/transformers/main/chat_templating), which means you can format messages using the
`tokenizer.apply_chat_template()` method:

```python
# tokenizer and model are loaded as shown in the Inference Code section below
messages = [
    {"role": "system", "content": "You are Hermes 2."},
    {"role": "user", "content": "Hello, who are you?"}
]
gen_input = tokenizer.apply_chat_template(messages, return_tensors="pt")
model.generate(gen_input)
```

When tokenizing messages for generation, set `add_generation_prompt=True` when calling `apply_chat_template()`. This will append `<|im_start|>assistant\n` to your prompt, to ensure
that the model continues with an assistant response.

To utilize the prompt format without a system prompt, simply leave the line out.
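The same structure can be built by hand, which makes the role of `add_generation_prompt` explicit. A minimal sketch (the helper name is ours, not part of the model's tooling):

```python
def build_chatml_prompt(messages, add_generation_prompt=True):
    """Render a list of {role, content} messages as a ChatML string."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
        for m in messages
    ]
    prompt = "".join(parts)
    if add_generation_prompt:
        # Cue the model to continue as the assistant.
        prompt += "<|im_start|>assistant\n"
    return prompt

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are Hermes 2."},
    {"role": "user", "content": "Hello, who are you?"},
])
print(prompt)
```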
## Prompt Format for Function Calling

Our model was trained on specific system prompts and structures for Function Calling. While the system prompt looks complicated, we have created a GitHub repo containing code to easily build these based on real python functions.

You should use the system role with this message, followed by a function signature json, as shown in this example:
```
<|im_start|>system
You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags. You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions. Here are the available tools: <tools> {"type": "function", "function": {"name": "get_stock_fundamentals", "description": "get_stock_fundamentals(symbol: str) -> dict - Get fundamental data for a given stock symbol using yfinance API.\\n\\n    Args:\\n        symbol (str): The stock symbol.\\n\\n    Returns:\\n        dict: A dictionary containing fundamental data.\\n            Keys:\\n                - \'symbol\': The stock symbol.\\n                - \'company_name\': The long name of the company.\\n                - \'sector\': The sector to which the company belongs.\\n                - \'industry\': The industry to which the company belongs.\\n                - \'market_cap\': The market capitalization of the company.\\n                - \'pe_ratio\': The forward price-to-earnings ratio.\\n                - \'pb_ratio\': The price-to-book ratio.\\n                - \'dividend_yield\': The dividend yield.\\n                - \'eps\': The trailing earnings per share.\\n                - \'beta\': The beta value of the stock.\\n                - \'52_week_high\': The 52-week high price of the stock.\\n                - \'52_week_low\': The 52-week low price of the stock.", "parameters": {"type": "object", "properties": {"symbol": {"type": "string"}}, "required": ["symbol"]}}} </tools> Use the following pydantic model json schema for each tool call you will make: {"properties": {"arguments": {"title": "Arguments", "type": "object"}, "name": {"title": "Name", "type": "string"}}, "required": ["arguments", "name"], "title": "FunctionCall", "type": "object"} For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:
<tool_call>
{"arguments": <args-dict>, "name": <function-name>}
</tool_call><|im_end|>
```
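In the spirit of the Hermes-Function-Calling repo (which builds these prompts from real Python functions), here is a stdlib-only sketch of turning a function's signature and docstring into a `<tools>` entry. The type mapping is deliberately simplified and is our own assumption, not the repo's implementation:

```python
import inspect
import json

def get_stock_fundamentals(symbol: str) -> dict:
    """Get fundamental data for a given stock symbol using yfinance API."""
    ...

def function_signature_json(fn):
    """Build an OpenAI-style tool signature from a Python function.

    Simplified sketch: only `str` annotations are mapped to a JSON type,
    and every parameter is treated as required.
    """
    sig = inspect.signature(fn)
    properties = {
        name: {"type": "string" if p.annotation is str else "object"}
        for name, p in sig.parameters.items()
    }
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": inspect.getdoc(fn),
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": list(properties),
            },
        },
    }

print(json.dumps(function_signature_json(get_stock_fundamentals), indent=2))
```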
200
+
201
+ To complete the function call, create a user prompt that follows the above system prompt, like so:
202
+ ```
203
+ <|im_start|>user
204
+ Fetch the stock fundamentals data for Tesla (TSLA)<|im_end|>
205
+ ```
206
+
207
+ The model will then generate a tool call, which your inference code must parse, and plug into a function (see example inference code here: https://github.com/NousResearch/Hermes-Function-Calling):
208
+ ```
209
+ <|im_start|>assistant
210
+ <tool_call>
211
+ {"arguments": {"symbol": "TSLA"}, "name": "get_stock_fundamentals"}
212
+ </tool_call><|im_end|>
213
+ ```
214
+
215
+ Once you parse the tool call, call the api and get the returned values for the call, and pass it back in as a new role, `tool` like so:
216
+ ```
217
+ <|im_start|>tool
218
+ <tool_response>
219
+ {"name": "get_stock_fundamentals", "content": {'symbol': 'TSLA', 'company_name': 'Tesla, Inc.', 'sector': 'Consumer Cyclical', 'industry': 'Auto Manufacturers', 'market_cap': 611384164352, 'pe_ratio': 49.604652, 'pb_ratio': 9.762013, 'dividend_yield': None, 'eps': 4.3, 'beta': 2.427, '52_week_high': 299.29, '52_week_low': 152.37}}
220
+ </tool_response>
221
+ <|im_end|>
222
+ ```
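The parse → call → respond loop above can be sketched in plain Python. The stub `get_stock_fundamentals` and its return value are invented here; the real parsing and dispatch code lives in the Hermes-Function-Calling repo:

```python
import json
import re

def parse_tool_call(assistant_text):
    """Extract the JSON object between <tool_call> tags, or None."""
    m = re.search(r"(?s)<tool_call>\s*(\{.*?\})\s*</tool_call>", assistant_text)
    return json.loads(m.group(1)) if m else None

def tool_message(name, content):
    """Format a function result as a ChatML `tool` turn."""
    body = json.dumps({"name": name, "content": content})
    return f"<|im_start|>tool\n<tool_response>\n{body}\n</tool_response>\n<|im_end|>\n"

# Stub standing in for a real yfinance-backed function; values invented.
def get_stock_fundamentals(symbol):
    return {"symbol": symbol, "pe_ratio": 49.6}

TOOLS = {"get_stock_fundamentals": get_stock_fundamentals}

assistant_text = (
    '<tool_call>\n'
    '{"arguments": {"symbol": "TSLA"}, "name": "get_stock_fundamentals"}\n'
    '</tool_call><|im_end|>'
)
call = parse_tool_call(assistant_text)
result = TOOLS[call["name"]](**call["arguments"])
print(tool_message(call["name"], result))
```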
The assistant will then read in that data from the function's response, and generate a natural language response:
```
<|im_start|>assistant
The stock fundamentals data for Tesla (TSLA) are as follows:
- **Symbol**: TSLA
- **Company Name**: Tesla, Inc.
- **Sector**: Consumer Cyclical
- **Industry**: Auto Manufacturers
- **Market Capitalization**: $566,160,130,480
- **Forward Price-to-Earnings Ratio (PE Ratio)**: 42.73
- **Price-to-Book Ratio (PB Ratio)**: 9.04
- **Dividend Yield**: N/A
- **Trailing Earnings Per Share (EPS)**: $4.3
- **Beta Value of the Stock**: 2.42
- **52-Week High Price of the Stock**: $299.29
- **52-Week Low Price of the Stock**: $152.37

This information provides a snapshot of Tesla's financial position and performance based on the fundamental data obtained from the yfinance API. It shows that Tesla has a substantial market capitalization and a relatively high P/E and P/B ratio compared to other stocks in its industry. The company does not pay a dividend at the moment, which is reflected by a 'Dividend Yield' of 'None'. The Beta value indicates that Tesla's stock has a moderate level of volatility relative to the market. The 52-week high and low prices give an idea of the stock's range over the past year. This data can be useful when assessing investment opportunities and making investment decisions.<|im_end|>
```
## Prompt Format for JSON Mode / Structured Outputs

Our model was also trained on a specific system prompt for Structured Outputs, with which it should respond with **only** a JSON object, conforming to a specific JSON schema.

Your schema can be made from a pydantic object using our codebase, with the standalone script `jsonmode.py` available here: https://github.com/NousResearch/Hermes-Function-Calling/tree/main

```
<|im_start|>system
You are a helpful assistant that answers in JSON. Here's the json schema you must adhere to:\n<schema>\n{schema}\n</schema><|im_end|>
```

Given the {schema} that you provide, the model will follow that JSON format in its response; all you have to do is give a typical user prompt, and it will respond in JSON.
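A minimal sketch of filling the `{schema}` placeholder and sanity-checking the model's reply against the schema's required keys. The sample schema and reply below are invented for illustration; the Hermes-Function-Calling repo generates real schemas from pydantic models:

```python
import json

SYSTEM_TEMPLATE = (
    "You are a helpful assistant that answers in JSON. "
    "Here's the json schema you must adhere to:\n<schema>\n{schema}\n</schema>"
)

# Invented example schema, shaped like what jsonmode.py might emit.
schema = {
    "title": "Character",
    "type": "object",
    "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
    "required": ["name", "age"],
}

system_prompt = SYSTEM_TEMPLATE.format(schema=json.dumps(schema))

# Invented model reply; check it parses and carries the required keys.
reply = '{"name": "Goku", "age": 44}'
data = json.loads(reply)
missing = [k for k in schema["required"] if k not in data]
assert not missing, f"missing keys: {missing}"
print(data)
```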

# Benchmarks

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6317aade83d8d2fd903192d9/suBbCUIxpcRvhCv6-DBDQ.png)

## GPT4All:
```
|    Task     |Version| Metric |Value |   |Stderr|
|-------------|------:|--------|-----:|---|-----:|
|arc_challenge|      0|acc     |0.5529|±  |0.0145|
|             |       |acc_norm|0.5870|±  |0.0144|
|arc_easy     |      0|acc     |0.8371|±  |0.0076|
|             |       |acc_norm|0.8144|±  |0.0080|
|boolq        |      1|acc     |0.8599|±  |0.0061|
|hellaswag    |      0|acc     |0.6133|±  |0.0049|
|             |       |acc_norm|0.7989|±  |0.0040|
|openbookqa   |      0|acc     |0.3940|±  |0.0219|
|             |       |acc_norm|0.4680|±  |0.0223|
|piqa         |      0|acc     |0.8063|±  |0.0092|
|             |       |acc_norm|0.8156|±  |0.0090|
|winogrande   |      0|acc     |0.7372|±  |0.0124|
```

Average: 72.59

## AGIEval:
```
|             Task             |Version| Metric |Value |   |Stderr|
|------------------------------|------:|--------|-----:|---|-----:|
|agieval_aqua_rat              |      0|acc     |0.2441|±  |0.0270|
|                              |       |acc_norm|0.2441|±  |0.0270|
|agieval_logiqa_en             |      0|acc     |0.3687|±  |0.0189|
|                              |       |acc_norm|0.3840|±  |0.0191|
|agieval_lsat_ar               |      0|acc     |0.2304|±  |0.0278|
|                              |       |acc_norm|0.2174|±  |0.0273|
|agieval_lsat_lr               |      0|acc     |0.5471|±  |0.0221|
|                              |       |acc_norm|0.5373|±  |0.0221|
|agieval_lsat_rc               |      0|acc     |0.6617|±  |0.0289|
|                              |       |acc_norm|0.6357|±  |0.0294|
|agieval_sat_en                |      0|acc     |0.7670|±  |0.0295|
|                              |       |acc_norm|0.7379|±  |0.0307|
|agieval_sat_en_without_passage|      0|acc     |0.4417|±  |0.0347|
|                              |       |acc_norm|0.4223|±  |0.0345|
|agieval_sat_math              |      0|acc     |0.4000|±  |0.0331|
|                              |       |acc_norm|0.3455|±  |0.0321|
```

Average: 44.05

## BigBench:

```
|                      Task                      |Version|       Metric        |Value |   |Stderr|
|------------------------------------------------|------:|---------------------|-----:|---|-----:|
|bigbench_causal_judgement                       |      0|multiple_choice_grade|0.6000|±  |0.0356|
|bigbench_date_understanding                     |      0|multiple_choice_grade|0.6585|±  |0.0247|
|bigbench_disambiguation_qa                      |      0|multiple_choice_grade|0.3178|±  |0.0290|
|bigbench_geometric_shapes                       |      0|multiple_choice_grade|0.2340|±  |0.0224|
|                                                |       |exact_str_match      |0.0000|±  |0.0000|
|bigbench_logical_deduction_five_objects         |      0|multiple_choice_grade|0.2980|±  |0.0205|
|bigbench_logical_deduction_seven_objects        |      0|multiple_choice_grade|0.2057|±  |0.0153|
|bigbench_logical_deduction_three_objects        |      0|multiple_choice_grade|0.5367|±  |0.0288|
|bigbench_movie_recommendation                   |      0|multiple_choice_grade|0.4040|±  |0.0220|
|bigbench_navigate                               |      0|multiple_choice_grade|0.4970|±  |0.0158|
|bigbench_reasoning_about_colored_objects        |      0|multiple_choice_grade|0.7075|±  |0.0102|
|bigbench_ruin_names                             |      0|multiple_choice_grade|0.4821|±  |0.0236|
|bigbench_salient_translation_error_detection    |      0|multiple_choice_grade|0.2295|±  |0.0133|
|bigbench_snarks                                 |      0|multiple_choice_grade|0.6906|±  |0.0345|
|bigbench_sports_understanding                   |      0|multiple_choice_grade|0.5375|±  |0.0159|
|bigbench_temporal_sequences                     |      0|multiple_choice_grade|0.6270|±  |0.0153|
|bigbench_tracking_shuffled_objects_five_objects |      0|multiple_choice_grade|0.2216|±  |0.0118|
|bigbench_tracking_shuffled_objects_seven_objects|      0|multiple_choice_grade|0.1594|±  |0.0088|
|bigbench_tracking_shuffled_objects_three_objects|      0|multiple_choice_grade|0.5367|±  |0.0288|
```

Average: 44.13

**IFEval**: 72.64

**MT_Bench**: Turn 1 - 8.3875, Turn 2 - 8.00625, Average - 8.196875

# Inference Code

Here is example code using HuggingFace Transformers to run inference with the model (note: in 4bit, it will require around 5GB of VRAM).

Note: To use function calling, see the GitHub repo above.

```python
# Code to inference Hermes with HF Transformers
# Requires pytorch, transformers, bitsandbytes, sentencepiece, protobuf, and flash-attn packages

import torch
from transformers import AutoTokenizer, LlamaForCausalLM
import bitsandbytes, flash_attn  # imported to fail fast if the 4-bit / flash-attention deps are missing

tokenizer = AutoTokenizer.from_pretrained('NousResearch/Hermes-2-Theta-Llama-3-8B', trust_remote_code=True)
model = LlamaForCausalLM.from_pretrained(
    "NousResearch/Hermes-2-Theta-Llama-3-8B",
    torch_dtype=torch.float16,
    device_map="auto",
    load_in_8bit=False,
    load_in_4bit=True,
    use_flash_attention_2=True
)

prompts = [
    """<|im_start|>system
You are a sentient, superintelligent artificial general intelligence, here to teach and assist me.<|im_end|>
<|im_start|>user
Write a short story about Goku discovering kirby has teamed up with Majin Buu to destroy the world.<|im_end|>
<|im_start|>assistant""",
]

for chat in prompts:
    print(chat)
    input_ids = tokenizer(chat, return_tensors="pt").input_ids.to("cuda")
    generated_ids = model.generate(input_ids, max_new_tokens=750, temperature=0.8, repetition_penalty=1.1, do_sample=True, eos_token_id=tokenizer.eos_token_id)
    response = tokenizer.decode(generated_ids[0][input_ids.shape[-1]:], skip_special_tokens=True, clean_up_tokenization_spaces=True)
    print(f"Response: {response}")
```

## Inference Code for Function Calling:

All code for utilizing, parsing, and building function calling templates is available on our GitHub:
[https://github.com/NousResearch/Hermes-Function-Calling](https://github.com/NousResearch/Hermes-Function-Calling)

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6317aade83d8d2fd903192d9/oi4CiGh50xmoviUQnh8R3.png)

# Chat Interfaces

When quantized versions of the model are released, I recommend using LM Studio for chatting with Hermes 2 Pro. It is a GUI application that runs GGUF models on a llama.cpp backend, provides a ChatGPT-like interface for chatting with the model, and supports ChatML out of the box. Note that it does not support function calling; for that, use our GitHub repo.
In LM-Studio, simply select the ChatML Prefix on the settings side pane:

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6317aade83d8d2fd903192d9/ls6WqV-GSxMw2RA3GuQiN.png)

## Quantized Versions:

GGUF Versions Available Here: https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-8B-GGUF

# How to cite:

```bibtex
@misc{Hermes-2-Theta-Llama-3-8B,
  url={https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-8B},
  title={Hermes-2-Theta-Llama-3-8B},
  author={Teknium and Charles Goddard and interstellarninja and theemozilla and karan4d and huemin_art}
}
```
config.json ADDED
{
  "_name_or_path": "NousResearch/Hermes-2-Theta-Llama-3-8B",
  "architectures": [
    "LlamaForCausalLM"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "bos_token_id": 128000,
  "eos_token_id": 128003,
  "hidden_act": "silu",
  "hidden_size": 4096,
  "initializer_range": 0.02,
  "intermediate_size": 14336,
  "max_position_embeddings": 8192,
  "mlp_bias": false,
  "model_type": "llama",
  "num_attention_heads": 32,
  "num_hidden_layers": 32,
  "num_key_value_heads": 8,
  "pretraining_tp": 1,
  "rms_norm_eps": 1e-05,
  "rope_scaling": null,
  "rope_theta": 500000.0,
  "tie_word_embeddings": false,
  "transformers_version": "4.41.2",
  "use_cache": true,
  "vocab_size": 128256
}
generation_config.json ADDED
{
  "_from_model_config": true,
  "bos_token_id": 128000,
  "do_sample": true,
  "eos_token_id": 128003,
  "transformers_version": "4.41.2",
  "use_cache": false
}
model.yaml ADDED
name: hermes-2-Theta-llama3
backend: transformers
parameters:
  model: fakezeta/Hermes-2-Theta-Llama-3-8B-ov-int4
context_size: 8192
type: OVModelForCausalLM
template:
  use_tokenizer_template: true
openvino_detokenizer.bin ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:6d53ddf0b2721999f715009e51b940bdff3a3e7f1154855458e597ba27822e95
size 1586425
openvino_detokenizer.xml ADDED
<?xml version="1.0"?>
<net name="detokenizer" version="11">
  <layers>
    <layer id="0" name="Parameter_228899" type="Parameter" version="opset1">
      <data shape="?,?" element_type="i64" />
      <output>
        <port id="0" precision="I64" names="Parameter_228899">
          <dim>-1</dim>
          <dim>-1</dim>
        </port>
      </output>
    </layer>
    <layer id="1" name="Convert_228915" type="Convert" version="opset1">
      <data destination_type="i32" />
      <input>
        <port id="0" precision="I64">
          <dim>-1</dim>
          <dim>-1</dim>
        </port>
      </input>
      <output>
        <port id="1" precision="I32">
          <dim>-1</dim>
          <dim>-1</dim>
        </port>
      </output>
    </layer>
    <layer id="2" name="Constant_228794" type="Const" version="opset1">
      <data element_type="u8" shape="1586376" offset="0" size="1586376" />
      <output>
        <port id="0" precision="U8">
          <dim>1586376</dim>
        </port>
      </output>
    </layer>
    <layer id="3" name="StringTensorUnpack_228795" type="StringTensorUnpack" version="extension">
      <data mode="begins_ends" />
      <input>
        <port id="0" precision="U8">
          <dim>1586376</dim>
        </port>
      </input>
      <output>
        <port id="1" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="2" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="3" precision="U8">
          <dim>-1</dim>
        </port>
      </output>
    </layer>
    <layer id="4" name="VocabDecoder_228900" type="VocabDecoder" version="extension">
      <data skip_tokens="128000, 128001, 128002, 128003, 128006, 128007, 128009, 128013, 128014, 128015, 128016, 128017, 128018, 128019, 128020, 128021, 128022, 128023, 128024, 128025, 128026, 128027, 128028, 128029, 128030, 128031, 128032, 128033, 128034, 128035, 128036, 128037, 128038, 128039, 128040, 128041, 128042, 128043, 128044, 128045, 128046, 128047, 128048, 128049, 128050, 128051, 128052, 128053, 128054, 128055, 128056, 128057, 128058, 128059, 128060, 128061, 128062, 128063, 128064, 128065, 128066, 128067, 128068, 128069, 128070, 128071, 128072, 128073, 128074, 128075, 128076, 128077, 128078, 128079, 128080, 128081, 128082, 128083, 128084, 128085, 128086, 128087, 128088, 128089, 128090, 128091, 128092, 128093, 128094, 128095, 128096, 128097, 128098, 128099, 128100, 128101, 128102, 128103, 128104, 128105, 128106, 128107, 128108, 128109, 128110, 128111, 128112, 128113, 128114, 128115, 128116, 128117, 128118, 128119, 128120, 128121, 128122, 128123, 128124, 128125, 128126, 128127, 128128, 128129, 128130, 128131, 128132, 128133, 128134, 128135, 128136, 128137, 128138, 128139, 128140, 128141, 128142, 128143, 128144, 128145, 128146, 128147, 128148, 128149, 128150, 128151, 128152, 128153, 128154, 128155, 128156, 128157, 128158, 128159, 128160, 128161, 128162, 128163, 128164, 128165, 128166, 128167, 128168, 128169, 128170, 128171, 128172, 128173, 128174, 128175, 128176, 128177, 128178, 128179, 128180, 128181, 128182, 128183, 128184, 128185, 128186, 128187, 128188, 128189, 128190, 128191, 128192, 128193, 128194, 128195, 128196, 128197, 128198, 128199, 128200, 128201, 128202, 128203, 128204, 128205, 128206, 128207, 128208, 128209, 128210, 128211, 128212, 128213, 128214, 128215, 128216, 128217, 128218, 128219, 128220, 128221, 128222, 128223, 128224, 128225, 128226, 128227, 128228, 128229, 128230, 128231, 128232, 128233, 128234, 128235, 128236, 128237, 128238, 128239, 128240, 128241, 128242, 128243, 128244, 128245, 128246, 128247, 128248, 128249, 128250, 128251, 128252, 128253, 128254, 128255" />
      <input>
        <port id="0" precision="I32">
          <dim>-1</dim>
          <dim>-1</dim>
        </port>
        <port id="1" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="2" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="3" precision="U8">
          <dim>-1</dim>
        </port>
      </input>
      <output>
        <port id="4" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="5" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="6" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="7" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="8" precision="U8">
          <dim>-1</dim>
        </port>
      </output>
    </layer>
    <layer id="5" name="CharsToBytes_228901" type="CharsToBytes" version="extension">
      <input>
        <port id="0" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="1" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="2" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="3" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="4" precision="U8">
          <dim>-1</dim>
        </port>
      </input>
      <output>
        <port id="5" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="6" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="7" precision="U8">
          <dim>-1</dim>
        </port>
      </output>
    </layer>
    <layer id="6" name="Constant_228903" type="Const" version="opset1">
      <data element_type="u8" shape="47" offset="1586376" size="47" />
      <output>
        <port id="0" precision="U8">
          <dim>47</dim>
        </port>
      </output>
    </layer>
    <layer id="7" name="Constant_228905" type="Const" version="opset1">
      <data element_type="u8" shape="2" offset="1586423" size="2" />
      <output>
        <port id="0" precision="U8">
          <dim>2</dim>
        </port>
      </output>
    </layer>
    <layer id="8" name="RegexNormalization_228906" type="RegexNormalization" version="extension">
      <data global_replace="true" />
      <input>
        <port id="0" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="1" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="2" precision="U8">
          <dim>-1</dim>
        </port>
        <port id="3" precision="U8">
          <dim>47</dim>
        </port>
        <port id="4" precision="U8">
          <dim>2</dim>
        </port>
      </input>
      <output>
        <port id="5" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="6" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="7" precision="U8">
          <dim>-1</dim>
        </port>
      </output>
    </layer>
    <layer id="9" name="StringTensorPack_228907" type="StringTensorPack" version="extension">
      <data mode="begins_ends" />
      <input>
        <port id="0" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="1" precision="I32">
          <dim>-1</dim>
        </port>
        <port id="2" precision="U8">
          <dim>-1</dim>
        </port>
      </input>
      <output>
        <port id="3" precision="STRING" names="string_output">
          <dim>-1</dim>
        </port>
      </output>
    </layer>
    <layer id="10" name="Result_228908" type="Result" version="opset1">
      <input>
        <port id="0" precision="STRING">
          <dim>-1</dim>
        </port>
      </input>
    </layer>
  </layers>
  <edges>
    <edge from-layer="0" from-port="0" to-layer="1" to-port="0" />
    <edge from-layer="1" from-port="1" to-layer="4" to-port="0" />
    <edge from-layer="2" from-port="0" to-layer="3" to-port="0" />
    <edge from-layer="3" from-port="1" to-layer="4" to-port="1" />
    <edge from-layer="3" from-port="2" to-layer="4" to-port="2" />
    <edge from-layer="3" from-port="3" to-layer="4" to-port="3" />
    <edge from-layer="4" from-port="8" to-layer="5" to-port="4" />
    <edge from-layer="4" from-port="7" to-layer="5" to-port="3" />
    <edge from-layer="4" from-port="6" to-layer="5" to-port="2" />
    <edge from-layer="4" from-port="5" to-layer="5" to-port="1" />
    <edge from-layer="4" from-port="4" to-layer="5" to-port="0" />
    <edge from-layer="5" from-port="5" to-layer="8" to-port="0" />
    <edge from-layer="5" from-port="6" to-layer="8" to-port="1" />
    <edge from-layer="5" from-port="7" to-layer="8" to-port="2" />
    <edge from-layer="6" from-port="0" to-layer="8" to-port="3" />
    <edge from-layer="7" from-port="0" to-layer="8" to-port="4" />
    <edge from-layer="8" from-port="5" to-layer="9" to-port="0" />
    <edge from-layer="8" from-port="6" to-layer="9" to-port="1" />
    <edge from-layer="8" from-port="7" to-layer="9" to-port="2" />
    <edge from-layer="9" from-port="3" to-layer="10" to-port="0" />
  </edges>
  <rt_info>
    <eos_token_id value="128003" />
  </rt_info>
</net>
openvino_model.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:4cf097e969a4a51fdc082c8111fde8025d782f7b8e425ca4395619a3b91c9207
3
+ size 5354616032
openvino_model.xml ADDED
The diff for this file is too large to render. See raw diff
 
openvino_tokenizer.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:fc4086afa7b4cd27dd604c8845e1183baf7d3570d9d2b4ac459b2db4975155a0
3
+ size 5688842
openvino_tokenizer.xml ADDED
@@ -0,0 +1,943 @@
1
+ <?xml version="1.0"?>
2
+ <net name="tokenizer" version="11">
3
+ <layers>
4
+ <layer id="0" name="Parameter_228707" type="Parameter" version="opset1">
5
+ <data shape="?" element_type="string" />
6
+ <output>
7
+ <port id="0" precision="STRING" names="Parameter_228707">
8
+ <dim>-1</dim>
9
+ </port>
10
+ </output>
11
+ </layer>
12
+ <layer id="1" name="Constant_228883" type="Const" version="opset1">
13
+ <data element_type="i32" shape="" offset="0" size="4" />
14
+ <output>
15
+ <port id="0" precision="I32" />
16
+ </output>
17
+ </layer>
18
+ <layer id="2" name="Constant_228884" type="Const" version="opset1">
19
+ <data element_type="i32" shape="" offset="4" size="4" />
20
+ <output>
21
+ <port id="0" precision="I32" />
22
+ </output>
23
+ </layer>
24
+ <layer id="3" name="Constant_228885" type="Const" version="opset1">
25
+ <data element_type="i32" shape="1" offset="8" size="4" />
26
+ <output>
27
+ <port id="0" precision="I32">
28
+ <dim>1</dim>
29
+ </port>
30
+ </output>
31
+ </layer>
32
+ <layer id="4" name="Constant_228713" type="Const" version="opset1">
33
+ <data element_type="i64" shape="" offset="12" size="8" />
34
+ <output>
35
+ <port id="0" precision="I64" />
36
+ </output>
37
+ </layer>
38
+ <layer id="5" name="StringTensorUnpack_228708" type="StringTensorUnpack" version="extension">
39
+ <data mode="begins_ends" />
40
+ <input>
41
+ <port id="0" precision="STRING">
42
+ <dim>-1</dim>
43
+ </port>
44
+ </input>
45
+ <output>
46
+ <port id="1" precision="I32">
47
+ <dim>-1</dim>
48
+ </port>
49
+ <port id="2" precision="I32">
50
+ <dim>-1</dim>
51
+ </port>
52
+ <port id="3" precision="U8">
53
+ <dim>-1</dim>
54
+ </port>
55
+ </output>
56
+ </layer>
57
+ <layer id="6" name="ShapeOf_228709" type="ShapeOf" version="opset3">
58
+ <data output_type="i64" />
59
+ <input>
60
+ <port id="0" precision="I32">
61
+ <dim>-1</dim>
62
+ </port>
63
+ </input>
64
+ <output>
65
+ <port id="1" precision="I64">
66
+ <dim>1</dim>
67
+ </port>
68
+ </output>
69
+ </layer>
70
+ <layer id="7" name="Constant_228710" type="Const" version="opset1">
71
+ <data element_type="i64" shape="" offset="12" size="8" />
72
+ <output>
73
+ <port id="0" precision="I64" />
74
+ </output>
75
+ </layer>
76
+ <layer id="8" name="Constant_228711" type="Const" version="opset1">
77
+ <data element_type="i64" shape="" offset="12" size="8" />
78
+ <output>
79
+ <port id="0" precision="I64" />
80
+ </output>
81
+ </layer>
82
+ <layer id="9" name="Gather_228712" type="Gather" version="opset8">
83
+ <data batch_dims="0" />
84
+ <input>
85
+ <port id="0" precision="I64">
86
+ <dim>1</dim>
87
+ </port>
88
+ <port id="1" precision="I64" />
89
+ <port id="2" precision="I64" />
90
+ </input>
91
+ <output>
92
+ <port id="3" precision="I64" />
93
+ </output>
94
+ </layer>
95
+ <layer id="10" name="Constant_228714" type="Const" version="opset1">
96
+ <data element_type="i64" shape="" offset="20" size="8" />
97
+ <output>
98
+ <port id="0" precision="I64" />
99
+ </output>
100
+ </layer>
101
+ <layer id="11" name="Range_228715" type="Range" version="opset4">
102
+ <data output_type="i32" />
103
+ <input>
104
+ <port id="0" precision="I64" />
105
+ <port id="1" precision="I64" />
106
+ <port id="2" precision="I64" />
107
+ </input>
108
+ <output>
109
+ <port id="3" precision="I32">
110
+ <dim>-1</dim>
111
+ </port>
112
+ </output>
113
+ </layer>
114
+ <layer id="12" name="Constant_228717" type="Const" version="opset1">
115
+ <data element_type="i64" shape="" offset="20" size="8" />
116
+ <output>
117
+ <port id="0" precision="I64" />
118
+ </output>
119
+ </layer>
120
+ <layer id="13" name="Constant_228718" type="Const" version="opset1">
121
+ <data element_type="i64" shape="" offset="20" size="8" />
122
+ <output>
123
+ <port id="0" precision="I64" />
124
+ </output>
125
+ </layer>
126
+ <layer id="14" name="Add_228719" type="Add" version="opset1">
127
+ <data auto_broadcast="numpy" />
128
+ <input>
129
+ <port id="0" precision="I64" />
130
+ <port id="1" precision="I64" />
131
+ </input>
132
+ <output>
133
+ <port id="2" precision="I64" />
134
+ </output>
135
+ </layer>
136
+ <layer id="15" name="Constant_228720" type="Const" version="opset1">
137
+ <data element_type="i64" shape="" offset="20" size="8" />
138
+ <output>
139
+ <port id="0" precision="I64" />
140
+ </output>
141
+ </layer>
142
+ <layer id="16" name="Range_228721" type="Range" version="opset4">
143
+ <data output_type="i32" />
144
+ <input>
145
+ <port id="0" precision="I64" />
146
+ <port id="1" precision="I64" />
147
+ <port id="2" precision="I64" />
148
+ </input>
149
+ <output>
150
+ <port id="3" precision="I32">
151
+ <dim>-1</dim>
152
+ </port>
153
+ </output>
154
+ </layer>
155
+ <layer id="17" name="Constant_228784" type="Const" version="opset1">
156
+ <data element_type="u8" shape="8668" offset="28" size="8668" />
157
+ <output>
158
+ <port id="0" precision="U8">
159
+ <dim>8668</dim>
160
+ </port>
161
+ </output>
162
+ </layer>
163
+ <layer id="18" name="RegexSplit_228785" type="RegexSplit" version="extension">
164
+ <data behaviour="isolate" invert="false" max_splits="-1" />
165
+ <input>
166
+ <port id="0" precision="I32">
167
+ <dim>-1</dim>
168
+ </port>
169
+ <port id="1" precision="I32">
170
+ <dim>-1</dim>
171
+ </port>
172
+ <port id="2" precision="I32">
173
+ <dim>-1</dim>
174
+ </port>
175
+ <port id="3" precision="I32">
176
+ <dim>-1</dim>
177
+ </port>
178
+ <port id="4" precision="U8">
179
+ <dim>-1</dim>
180
+ </port>
181
+ <port id="5" precision="U8">
182
+ <dim>8668</dim>
183
+ </port>
184
+ </input>
185
+ <output>
186
+ <port id="6" precision="I32">
187
+ <dim>-1</dim>
188
+ </port>
189
+ <port id="7" precision="I32">
190
+ <dim>-1</dim>
191
+ </port>
192
+ <port id="8" precision="I32">
193
+ <dim>-1</dim>
194
+ </port>
195
+ <port id="9" precision="I32">
196
+ <dim>-1</dim>
197
+ </port>
198
+ <port id="10" precision="U8">
199
+ <dim>-1</dim>
200
+ </port>
201
+ </output>
202
+ </layer>
203
+ <layer id="19" name="Constant_228790" type="Const" version="opset1">
204
+ <data element_type="u8" shape="120" offset="8696" size="120" />
205
+ <output>
206
+ <port id="0" precision="U8">
207
+ <dim>120</dim>
208
+ </port>
209
+ </output>
210
+ </layer>
211
+ <layer id="20" name="Constant_228787" type="Const" version="opset1">
212
+ <data element_type="u8" shape="8429" offset="8816" size="8429" />
213
+ <output>
214
+ <port id="0" precision="U8">
215
+ <dim>8429</dim>
216
+ </port>
217
+ </output>
218
+ </layer>
219
+ <layer id="21" name="StringTensorUnpack_228788" type="StringTensorUnpack" version="extension">
220
+ <data mode="begins_ends" />
221
+ <input>
222
+ <port id="0" precision="U8">
223
+ <dim>8429</dim>
224
+ </port>
225
+ </input>
226
+ <output>
227
+ <port id="1" precision="I32">
228
+ <dim>-1</dim>
229
+ </port>
230
+ <port id="2" precision="I32">
231
+ <dim>-1</dim>
232
+ </port>
233
+ <port id="3" precision="U8">
234
+ <dim>-1</dim>
235
+ </port>
236
+ </output>
237
+ </layer>
238
+ <layer id="22" name="RegexSplit_228791" type="RegexSplit" version="extension">
239
+ <data behaviour="isolate" invert="false" max_splits="-1" />
240
+ <input>
241
+ <port id="0" precision="I32">
242
+ <dim>-1</dim>
243
+ </port>
244
+ <port id="1" precision="I32">
245
+ <dim>-1</dim>
246
+ </port>
247
+ <port id="2" precision="I32">
248
+ <dim>-1</dim>
249
+ </port>
250
+ <port id="3" precision="I32">
251
+ <dim>-1</dim>
252
+ </port>
253
+ <port id="4" precision="U8">
254
+ <dim>-1</dim>
255
+ </port>
256
+ <port id="5" precision="U8">
257
+ <dim>120</dim>
258
+ </port>
259
+ <port id="6" precision="I32">
260
+ <dim>-1</dim>
261
+ </port>
262
+ <port id="7" precision="I32">
263
+ <dim>-1</dim>
264
+ </port>
265
+ <port id="8" precision="U8">
266
+ <dim>-1</dim>
267
+ </port>
268
+ </input>
269
+ <output>
270
+ <port id="9" precision="I32">
271
+ <dim>-1</dim>
272
+ </port>
273
+ <port id="10" precision="I32">
274
+ <dim>-1</dim>
275
+ </port>
276
+ <port id="11" precision="I32">
277
+ <dim>-1</dim>
278
+ </port>
279
+ <port id="12" precision="I32">
280
+ <dim>-1</dim>
281
+ </port>
282
+ <port id="13" precision="U8">
283
+ <dim>-1</dim>
284
+ </port>
285
+ </output>
286
+ </layer>
287
+ <layer id="23" name="BytesToChars_228792" type="BytesToChars" version="extension">
288
+ <input>
289
+ <port id="0" precision="I32">
290
+ <dim>-1</dim>
291
+ </port>
292
+ <port id="1" precision="I32">
293
+ <dim>-1</dim>
294
+ </port>
295
+ <port id="2" precision="I32">
296
+ <dim>-1</dim>
297
+ </port>
298
+ <port id="3" precision="I32">
299
+ <dim>-1</dim>
300
+ </port>
301
+ <port id="4" precision="U8">
302
+ <dim>-1</dim>
303
+ </port>
304
+ </input>
305
+ <output>
306
+ <port id="5" precision="I32">
307
+ <dim>-1</dim>
308
+ </port>
309
+ <port id="6" precision="I32">
310
+ <dim>-1</dim>
311
+ </port>
312
+ <port id="7" precision="I32">
313
+ <dim>-1</dim>
314
+ </port>
315
+ <port id="8" precision="I32">
316
+ <dim>-1</dim>
317
+ </port>
318
+ <port id="9" precision="U8">
319
+ <dim>-1</dim>
320
+ </port>
321
+ </output>
322
+ </layer>
323
+ <layer id="24" name="Constant_228794" type="Const" version="opset1">
324
+ <data element_type="u8" shape="1586376" offset="17245" size="1586376" />
325
+ <output>
326
+ <port id="0" precision="U8">
327
+ <dim>1586376</dim>
328
+ </port>
329
+ </output>
330
+ </layer>
331
+ <layer id="25" name="StringTensorUnpack_228795" type="StringTensorUnpack" version="extension">
332
+ <data mode="begins_ends" />
333
+ <input>
334
+ <port id="0" precision="U8">
335
+ <dim>1586376</dim>
336
+ </port>
337
+ </input>
338
+ <output>
339
+ <port id="1" precision="I32">
340
+ <dim>-1</dim>
341
+ </port>
342
+ <port id="2" precision="I32">
343
+ <dim>-1</dim>
344
+ </port>
345
+ <port id="3" precision="U8">
346
+ <dim>-1</dim>
347
+ </port>
348
+ </output>
349
+ </layer>
350
+ <layer id="26" name="Constant_228875" type="Const" version="opset1">
351
+ <data element_type="u8" shape="4084185" offset="1603621" size="4084185" />
352
+ <output>
353
+ <port id="0" precision="U8">
354
+ <dim>4084185</dim>
355
+ </port>
356
+ </output>
357
+ </layer>
358
+ <layer id="27" name="StringTensorUnpack_228876" type="StringTensorUnpack" version="extension">
359
+ <data mode="begins_ends" />
360
+ <input>
361
+ <port id="0" precision="U8">
362
+ <dim>4084185</dim>
363
+ </port>
364
+ </input>
365
+ <output>
366
+ <port id="1" precision="I32">
367
+ <dim>-1</dim>
368
+ </port>
369
+ <port id="2" precision="I32">
370
+ <dim>-1</dim>
371
+ </port>
372
+ <port id="3" precision="U8">
373
+ <dim>-1</dim>
374
+ </port>
375
+ </output>
376
+ </layer>
377
+ <layer id="28" name="Constant_228803" type="Const" version="opset1">
378
+ <data element_type="i64" shape="" offset="12" size="8" />
379
+ <output>
380
+ <port id="0" precision="I64" />
381
+ </output>
382
+ </layer>
383
+ <layer id="29" name="Constant_228797" type="Const" version="opset1">
384
+ <data element_type="u8" shape="8429" offset="8816" size="8429" />
385
+ <output>
386
+ <port id="0" precision="U8">
387
+ <dim>8429</dim>
388
+ </port>
389
+ </output>
390
+ </layer>
391
+ <layer id="30" name="StringTensorUnpack_228798" type="StringTensorUnpack" version="extension">
392
+ <data mode="begins_ends" />
393
+ <input>
394
+ <port id="0" precision="U8">
395
+ <dim>8429</dim>
396
+ </port>
397
+ </input>
398
+ <output>
399
+ <port id="1" precision="I32">
400
+ <dim>-1</dim>
401
+ </port>
402
+ <port id="2" precision="I32">
403
+ <dim>-1</dim>
404
+ </port>
405
+ <port id="3" precision="U8">
406
+ <dim>-1</dim>
407
+ </port>
408
+ </output>
409
+ </layer>
410
+ <layer id="31" name="ShapeOf_228799" type="ShapeOf" version="opset3">
411
+ <data output_type="i64" />
412
+ <input>
413
+ <port id="0" precision="I32">
414
+ <dim>-1</dim>
415
+ </port>
416
+ </input>
417
+ <output>
418
+ <port id="1" precision="I64">
419
+ <dim>1</dim>
420
+ </port>
421
+ </output>
422
+ </layer>
423
+ <layer id="32" name="Constant_228800" type="Const" version="opset1">
424
+ <data element_type="i64" shape="" offset="12" size="8" />
425
+ <output>
426
+ <port id="0" precision="I64" />
427
+ </output>
428
+ </layer>
429
+ <layer id="33" name="Constant_228801" type="Const" version="opset1">
430
+ <data element_type="i64" shape="" offset="12" size="8" />
431
+ <output>
432
+ <port id="0" precision="I64" />
433
+ </output>
434
+ </layer>
435
+ <layer id="34" name="Gather_228802" type="Gather" version="opset8">
436
+ <data batch_dims="0" />
437
+ <input>
438
+ <port id="0" precision="I64">
439
+ <dim>1</dim>
440
+ </port>
441
+ <port id="1" precision="I64" />
442
+ <port id="2" precision="I64" />
443
+ </input>
444
+ <output>
445
+ <port id="3" precision="I64" />
446
+ </output>
447
+ </layer>
448
+ <layer id="35" name="Constant_228804" type="Const" version="opset1">
449
+ <data element_type="i64" shape="" offset="20" size="8" />
450
+ <output>
451
+ <port id="0" precision="I64" />
452
+ </output>
453
+ </layer>
454
+ <layer id="36" name="Range_228805" type="Range" version="opset4">
455
+ <data output_type="i32" />
456
+ <input>
457
+ <port id="0" precision="I64" />
458
+ <port id="1" precision="I64" />
459
+ <port id="2" precision="I64" />
460
+ </input>
461
+ <output>
462
+ <port id="3" precision="I32">
463
+ <dim>-1</dim>
464
+ </port>
465
+ </output>
466
+ </layer>
467
+ <layer id="37" name="Constant_228807" type="Const" version="opset1">
468
+ <data element_type="i64" shape="" offset="20" size="8" />
469
+ <output>
470
+ <port id="0" precision="I64" />
471
+ </output>
472
+ </layer>
473
+ <layer id="38" name="Constant_228808" type="Const" version="opset1">
474
+ <data element_type="i64" shape="" offset="20" size="8" />
475
+ <output>
476
+ <port id="0" precision="I64" />
477
+ </output>
478
+ </layer>
479
+ <layer id="39" name="Add_228809" type="Add" version="opset1">
480
+ <data auto_broadcast="numpy" />
481
+ <input>
482
+ <port id="0" precision="I64" />
483
+ <port id="1" precision="I64" />
484
+ </input>
485
+ <output>
486
+ <port id="2" precision="I64" />
487
+ </output>
488
+ </layer>
489
+ <layer id="40" name="Constant_228810" type="Const" version="opset1">
490
+ <data element_type="i64" shape="" offset="20" size="8" />
491
+ <output>
492
+ <port id="0" precision="I64" />
493
+ </output>
494
+ </layer>
495
+ <layer id="41" name="Range_228811" type="Range" version="opset4">
496
+ <data output_type="i32" />
497
+ <input>
498
+ <port id="0" precision="I64" />
499
+ <port id="1" precision="I64" />
500
+ <port id="2" precision="I64" />
501
+ </input>
502
+ <output>
503
+ <port id="3" precision="I32">
504
+ <dim>-1</dim>
505
+ </port>
506
+ </output>
507
+ </layer>
508
+ <layer id="42" name="BytesToChars_228873" type="BytesToChars" version="extension">
509
+ <input>
510
+ <port id="0" precision="I32">
511
+ <dim>-1</dim>
512
+ </port>
513
+ <port id="1" precision="I32">
514
+ <dim>-1</dim>
515
+ </port>
516
+ <port id="2" precision="I32">
517
+ <dim>-1</dim>
518
+ </port>
519
+ <port id="3" precision="I32">
520
+ <dim>-1</dim>
521
+ </port>
522
+ <port id="4" precision="U8">
523
+ <dim>-1</dim>
524
+ </port>
525
+ </input>
526
+ <output>
527
+ <port id="5" precision="I32">
528
+ <dim>-1</dim>
529
+ </port>
530
+ <port id="6" precision="I32">
531
+ <dim>-1</dim>
532
+ </port>
533
+ <port id="7" precision="I32">
534
+ <dim>-1</dim>
535
+ </port>
536
+ <port id="8" precision="I32">
537
+ <dim>-1</dim>
538
+ </port>
539
+ <port id="9" precision="U8">
540
+ <dim>-1</dim>
541
+ </port>
542
+ </output>
543
+ </layer>
544
+ <layer id="43" name="Constant_228877" type="Const" version="opset1">
545
+ <data element_type="i32" shape="257" offset="5687806" size="1028" />
546
+ <output>
547
+ <port id="0" precision="I32">
548
+ <dim>257</dim>
549
+ </port>
550
+ </output>
551
+ </layer>
552
+ <layer id="44" name="BPETokenizer_228878" type="BPETokenizer" version="extension">
553
+ <data unk_token="" fuse_unk="false" suffix_indicator="" end_suffix="" byte_fallback="false" />
554
+ <input>
555
+ <port id="0" precision="I32">
556
+ <dim>-1</dim>
557
+ </port>
558
+ <port id="1" precision="I32">
559
+ <dim>-1</dim>
560
+ </port>
561
+ <port id="2" precision="I32">
562
+ <dim>-1</dim>
563
+ </port>
564
+ <port id="3" precision="I32">
565
+ <dim>-1</dim>
566
+ </port>
567
+ <port id="4" precision="U8">
568
+ <dim>-1</dim>
569
+ </port>
570
+ <port id="5" precision="I32">
571
+ <dim>-1</dim>
572
+ </port>
573
+ <port id="6" precision="I32">
574
+ <dim>-1</dim>
575
+ </port>
576
+ <port id="7" precision="U8">
577
+ <dim>-1</dim>
578
+ </port>
579
+ <port id="8" precision="I32">
580
+ <dim>-1</dim>
581
+ </port>
582
+ <port id="9" precision="I32">
583
+ <dim>-1</dim>
584
+ </port>
585
+ <port id="10" precision="U8">
586
+ <dim>-1</dim>
587
+ </port>
588
+ <port id="11" precision="I32">
589
+ <dim>-1</dim>
590
+ </port>
591
+ <port id="12" precision="I32">
592
+ <dim>-1</dim>
593
+ </port>
594
+ <port id="13" precision="U8">
595
+ <dim>-1</dim>
596
+ </port>
597
+ <port id="14" precision="I32">
598
+ <dim>257</dim>
599
+ </port>
600
+ </input>
601
+ <output>
602
+ <port id="15" precision="I32">
603
+ <dim>-1</dim>
604
+ </port>
605
+ <port id="16" precision="I32">
606
+ <dim>-1</dim>
607
+ </port>
608
+ <port id="17" precision="I32">
609
+ <dim>-1</dim>
610
+ </port>
611
+ </output>
612
+ </layer>
613
+ <layer id="45" name="Subtract_228879" type="Subtract" version="opset1">
614
+ <data auto_broadcast="numpy" />
615
+ <input>
616
+ <port id="0" precision="I32">
617
+ <dim>-1</dim>
618
+ </port>
619
+ <port id="1" precision="I32">
620
+ <dim>-1</dim>
621
+ </port>
622
+ </input>
623
+ <output>
624
+ <port id="2" precision="I32">
625
+ <dim>-1</dim>
626
+ </port>
627
+ </output>
628
+ </layer>
629
+ <layer id="46" name="Constant_228880" type="Const" version="opset1">
630
+ <data element_type="i32" shape="" offset="5688834" size="4" />
631
+ <output>
632
+ <port id="0" precision="I32" />
633
+ </output>
634
+ </layer>
635
+ <layer id="47" name="Minimum_228881" type="Minimum" version="opset1">
636
+ <data auto_broadcast="numpy" />
637
+ <input>
638
+ <port id="0" precision="I32">
639
+ <dim>-1</dim>
640
+ </port>
641
+ <port id="1" precision="I32" />
642
+ </input>
643
+ <output>
644
+ <port id="2" precision="I32">
645
+ <dim>-1</dim>
646
+ </port>
647
+ </output>
648
+ </layer>
649
+ <layer id="48" name="Add_228882" type="Add" version="opset1">
650
+ <data auto_broadcast="numpy" />
651
+ <input>
652
+ <port id="0" precision="I32">
653
+ <dim>-1</dim>
654
+ </port>
655
+ <port id="1" precision="I32">
656
+ <dim>-1</dim>
657
+ </port>
658
+ </input>
659
+ <output>
660
+ <port id="2" precision="I32">
661
+ <dim>-1</dim>
662
+ </port>
663
+ </output>
664
+ </layer>
665
+ <layer id="49" name="Constant_228886" type="Const" version="opset1">
666
+ <data element_type="i32" shape="2" offset="12" size="8" />
667
+ <output>
668
+ <port id="0" precision="I32">
669
+ <dim>2</dim>
670
+ </port>
671
+ </output>
672
+ </layer>
673
+ <layer id="50" name="CombineSegments_228887" type="CombineSegments" version="extension">
674
+ <input>
675
+ <port id="0" precision="I32" />
676
+ <port id="1" precision="I32" />
677
+ <port id="2" precision="I32">
678
+ <dim>1</dim>
679
+ </port>
680
+ <port id="3" precision="I32">
681
+ <dim>-1</dim>
682
+ </port>
683
+ <port id="4" precision="I32">
684
+ <dim>-1</dim>
685
+ </port>
686
+ <port id="5" precision="I32">
687
+ <dim>-1</dim>
688
+ </port>
689
+ <port id="6" precision="I32">
690
+ <dim>2</dim>
691
+ </port>
692
+ </input>
693
+ <output>
694
+ <port id="7" precision="I32">
695
+ <dim>-1</dim>
696
+ </port>
697
+ <port id="8" precision="I32">
698
+ <dim>-1</dim>
699
+ </port>
700
+ <port id="9" precision="I32">
701
+ <dim>-1</dim>
702
+ </port>
703
+ <port id="10" precision="I32">
704
+ <dim>-1</dim>
705
+ </port>
706
+ <port id="11" precision="I32">
707
+ <dim>-1</dim>
708
+ </port>
709
+ <port id="12" precision="I32">
710
+ <dim>-1</dim>
711
+ </port>
712
+ </output>
713
+ </layer>
714
+ <layer id="51" name="Subtract_228888" type="Subtract" version="opset1">
715
+ <data auto_broadcast="numpy" />
716
+ <input>
717
+ <port id="0" precision="I32">
718
+ <dim>-1</dim>
719
+ </port>
720
+ <port id="1" precision="I32">
721
+ <dim>-1</dim>
722
+ </port>
723
+ </input>
724
+ <output>
725
+ <port id="2" precision="I32">
726
+ <dim>-1</dim>
727
+ </port>
728
+ </output>
729
+ </layer>
730
+ <layer id="52" name="Constant_228889" type="Const" version="opset1">
731
+ <data element_type="i32" shape="" offset="0" size="4" />
732
+ <output>
733
+ <port id="0" precision="I32" />
734
+ </output>
735
+ </layer>
736
+ <layer id="53" name="ReduceMax_228890" type="ReduceMax" version="opset1">
737
+ <data keep_dims="false" />
738
+ <input>
739
+ <port id="0" precision="I32">
740
+ <dim>-1</dim>
741
+ </port>
742
+ <port id="1" precision="I32" />
743
+ </input>
744
+ <output>
745
+ <port id="2" precision="I32" />
746
+ </output>
747
+ </layer>
748
+ <layer id="54" name="Constant_228891" type="Const" version="opset1">
749
+ <data element_type="i32" shape="" offset="5688838" size="4" />
750
+ <output>
751
+ <port id="0" precision="I32" />
752
+ </output>
753
+ </layer>
754
+ <layer id="55" name="RaggedToDense_228892" type="RaggedToDense" version="extension">
755
+ <data pad_right="true" />
756
+ <input>
757
+ <port id="0" precision="I32">
758
+ <dim>-1</dim>
759
+ </port>
760
+ <port id="1" precision="I32">
761
+ <dim>-1</dim>
762
+ </port>
763
+ <port id="2" precision="I32">
764
+ <dim>-1</dim>
765
+ </port>
766
+ <port id="3" precision="I32" />
767
+ <port id="4" precision="I32" />
768
+ </input>
769
+ <output>
770
+ <port id="5" precision="I32">
771
+ <dim>-1</dim>
772
+ <dim>-1</dim>
773
+ </port>
774
+ <port id="6" precision="BOOL">
775
+ <dim>-1</dim>
776
+ <dim>-1</dim>
777
+ </port>
778
+ </output>
779
+ </layer>
780
+ <layer id="56" name="Convert_228893" type="Convert" version="opset1">
781
+ <data destination_type="i32" />
782
+ <input>
783
+ <port id="0" precision="BOOL">
784
+ <dim>-1</dim>
785
+ <dim>-1</dim>
786
+ </port>
787
+ </input>
788
+ <output>
789
+ <port id="1" precision="I32">
790
+ <dim>-1</dim>
791
+ <dim>-1</dim>
792
+ </port>
793
+ </output>
794
+ </layer>
795
+ <layer id="57" name="Convert_228893" type="Convert" version="opset1">
796
+ <data destination_type="i64" />
797
+ <input>
798
+ <port id="0" precision="I32">
799
+ <dim>-1</dim>
800
+ <dim>-1</dim>
801
+ </port>
802
+ </input>
803
+ <output>
804
+ <port id="1" precision="I64" names="attention_mask">
805
+ <dim>-1</dim>
806
+ <dim>-1</dim>
807
+ </port>
808
+ </output>
809
+ </layer>
810
+ <layer id="59" name="RaggedToDense_228892.0" type="Convert" version="opset1">
811
+ <data destination_type="i64" />
812
+ <input>
813
+ <port id="0" precision="I32">
814
+ <dim>-1</dim>
815
+ <dim>-1</dim>
816
+ </port>
817
+ </input>
818
+ <output>
819
+ <port id="1" precision="I64" names="input_ids">
820
+ <dim>-1</dim>
821
+ <dim>-1</dim>
822
+ </port>
823
+ </output>
824
+ </layer>
825
+ <layer id="60" name="Result_228896" type="Result" version="opset1">
826
+ <input>
827
+ <port id="0" precision="I64">
828
+ <dim>-1</dim>
829
+ <dim>-1</dim>
830
+ </port>
831
+ </input>
832
+ </layer>
833
+ <layer id="58" name="Result_228898" type="Result" version="opset1">
834
+ <input>
835
+ <port id="0" precision="I64">
836
+ <dim>-1</dim>
837
+ <dim>-1</dim>
838
+ </port>
839
+ </input>
840
+ </layer>
841
+ </layers>
842
+ <edges>
843
+ <edge from-layer="0" from-port="0" to-layer="5" to-port="0" />
844
+ <edge from-layer="1" from-port="0" to-layer="50" to-port="0" />
845
+ <edge from-layer="2" from-port="0" to-layer="50" to-port="1" />
846
+ <edge from-layer="3" from-port="0" to-layer="50" to-port="2" />
847
+ <edge from-layer="4" from-port="0" to-layer="11" to-port="0" />
848
+ <edge from-layer="5" from-port="1" to-layer="6" to-port="0" />
849
+ <edge from-layer="5" from-port="3" to-layer="18" to-port="4" />
850
+ <edge from-layer="5" from-port="2" to-layer="18" to-port="3" />
851
+ <edge from-layer="5" from-port="1" to-layer="18" to-port="2" />
852
+ <edge from-layer="6" from-port="1" to-layer="9" to-port="0" />
853
+ <edge from-layer="7" from-port="0" to-layer="9" to-port="1" />
854
+ <edge from-layer="8" from-port="0" to-layer="9" to-port="2" />
855
+ <edge from-layer="9" from-port="3" to-layer="14" to-port="0" />
856
+ <edge from-layer="9" from-port="3" to-layer="11" to-port="1" />
857
+ <edge from-layer="10" from-port="0" to-layer="11" to-port="2" />
858
+ <edge from-layer="11" from-port="3" to-layer="18" to-port="0" />
859
+ <edge from-layer="12" from-port="0" to-layer="16" to-port="0" />
860
+ <edge from-layer="13" from-port="0" to-layer="14" to-port="1" />
861
+ <edge from-layer="14" from-port="2" to-layer="16" to-port="1" />
862
+ <edge from-layer="15" from-port="0" to-layer="16" to-port="2" />
863
+ <edge from-layer="16" from-port="3" to-layer="18" to-port="1" />
864
+ <edge from-layer="17" from-port="0" to-layer="18" to-port="5" />
865
+ <edge from-layer="18" from-port="6" to-layer="22" to-port="0" />
866
+ <edge from-layer="18" from-port="7" to-layer="22" to-port="1" />
867
+ <edge from-layer="18" from-port="8" to-layer="22" to-port="2" />
868
+ <edge from-layer="18" from-port="9" to-layer="22" to-port="3" />
869
+ <edge from-layer="18" from-port="10" to-layer="22" to-port="4" />
870
+ <edge from-layer="19" from-port="0" to-layer="22" to-port="5" />
871
+ <edge from-layer="20" from-port="0" to-layer="21" to-port="0" />
872
+ <edge from-layer="21" from-port="1" to-layer="22" to-port="6" />
873
+ <edge from-layer="21" from-port="2" to-layer="22" to-port="7" />
874
+ <edge from-layer="21" from-port="3" to-layer="22" to-port="8" />
875
+ <edge from-layer="22" from-port="13" to-layer="23" to-port="4" />
876
+ <edge from-layer="22" from-port="9" to-layer="23" to-port="0" />
877
+ <edge from-layer="22" from-port="12" to-layer="23" to-port="3" />
878
+ <edge from-layer="22" from-port="11" to-layer="23" to-port="2" />
879
+ <edge from-layer="22" from-port="10" to-layer="23" to-port="1" />
880
+ <edge from-layer="23" from-port="7" to-layer="44" to-port="2" />
881
+ <edge from-layer="23" from-port="6" to-layer="44" to-port="1" />
882
+ <edge from-layer="23" from-port="5" to-layer="44" to-port="0" />
883
+ <edge from-layer="23" from-port="8" to-layer="44" to-port="3" />
884
+ <edge from-layer="23" from-port="9" to-layer="44" to-port="4" />
885
+ <edge from-layer="24" from-port="0" to-layer="25" to-port="0" />
886
+ <edge from-layer="25" from-port="3" to-layer="44" to-port="7" />
887
+ <edge from-layer="25" from-port="2" to-layer="44" to-port="6" />
888
+ <edge from-layer="25" from-port="1" to-layer="44" to-port="5" />
889
+ <edge from-layer="26" from-port="0" to-layer="27" to-port="0" />
890
+ <edge from-layer="27" from-port="1" to-layer="44" to-port="8" />
891
+ <edge from-layer="27" from-port="2" to-layer="44" to-port="9" />
892
+ <edge from-layer="27" from-port="3" to-layer="44" to-port="10" />
893
+ <edge from-layer="28" from-port="0" to-layer="36" to-port="0" />
894
+ <edge from-layer="29" from-port="0" to-layer="30" to-port="0" />
895
+ <edge from-layer="30" from-port="1" to-layer="31" to-port="0" />
896
+ <edge from-layer="30" from-port="1" to-layer="42" to-port="2" />
897
+ <edge from-layer="30" from-port="2" to-layer="42" to-port="3" />
898
+ <edge from-layer="30" from-port="3" to-layer="42" to-port="4" />
899
+ <edge from-layer="31" from-port="1" to-layer="34" to-port="0" />
900
+ <edge from-layer="32" from-port="0" to-layer="34" to-port="1" />
901
+ <edge from-layer="33" from-port="0" to-layer="34" to-port="2" />
902
+ <edge from-layer="34" from-port="3" to-layer="39" to-port="0" />
903
+ <edge from-layer="34" from-port="3" to-layer="36" to-port="1" />
904
+ <edge from-layer="35" from-port="0" to-layer="36" to-port="2" />
905
+ <edge from-layer="36" from-port="3" to-layer="42" to-port="0" />
906
+ <edge from-layer="37" from-port="0" to-layer="41" to-port="0" />
907
+ <edge from-layer="38" from-port="0" to-layer="39" to-port="1" />
908
+ <edge from-layer="39" from-port="2" to-layer="41" to-port="1" />
909
+ <edge from-layer="40" from-port="0" to-layer="41" to-port="2" />
910
+ <edge from-layer="41" from-port="3" to-layer="42" to-port="1" />
911
+ <edge from-layer="42" from-port="8" to-layer="44" to-port="12" />
912
+ <edge from-layer="42" from-port="7" to-layer="44" to-port="11" />
913
+ <edge from-layer="42" from-port="9" to-layer="44" to-port="13" />
914
+ <edge from-layer="43" from-port="0" to-layer="44" to-port="14" />
915
+ <edge from-layer="44" from-port="16" to-layer="45" to-port="0" />
916
+ <edge from-layer="44" from-port="15" to-layer="45" to-port="1" />
917
+ <edge from-layer="44" from-port="15" to-layer="48" to-port="0" />
918
+ <edge from-layer="44" from-port="15" to-layer="50" to-port="3" />
919
+ <edge from-layer="44" from-port="17" to-layer="50" to-port="5" />
920
+ <edge from-layer="45" from-port="2" to-layer="47" to-port="0" />
921
+ <edge from-layer="46" from-port="0" to-layer="47" to-port="1" />
922
+ <edge from-layer="47" from-port="2" to-layer="48" to-port="1" />
923
+ <edge from-layer="48" from-port="2" to-layer="50" to-port="4" />
924
+ <edge from-layer="49" from-port="0" to-layer="50" to-port="6" />
925
+ <edge from-layer="50" from-port="8" to-layer="51" to-port="0" />
926
+ <edge from-layer="50" from-port="7" to-layer="51" to-port="1" />
927
+ <edge from-layer="50" from-port="7" to-layer="55" to-port="0" />
928
+ <edge from-layer="50" from-port="8" to-layer="55" to-port="1" />
929
+ <edge from-layer="50" from-port="9" to-layer="55" to-port="2" />
930
+ <edge from-layer="51" from-port="2" to-layer="53" to-port="0" />
931
+ <edge from-layer="52" from-port="0" to-layer="53" to-port="1" />
932
+ <edge from-layer="53" from-port="2" to-layer="55" to-port="3" />
933
+ <edge from-layer="54" from-port="0" to-layer="55" to-port="4" />
934
+ <edge from-layer="55" from-port="6" to-layer="56" to-port="0" />
935
+ <edge from-layer="55" from-port="5" to-layer="59" to-port="0" />
936
+ <edge from-layer="56" from-port="1" to-layer="57" to-port="0" />
937
+ <edge from-layer="57" from-port="1" to-layer="58" to-port="0" />
938
+ <edge from-layer="59" from-port="1" to-layer="60" to-port="0" />
939
+ </edges>
940
+ <rt_info>
941
+ <eos_token_id value="128003" />
942
+ </rt_info>
943
+ </net>
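The IR's `<rt_info>` block above records `eos_token_id` 128003, the ChatML `<|im_end|>` end-of-turn token this conversion stops on. A minimal sketch of reading that value, using plain stdlib XML parsing on a fragment mirroring the snippet above (not the OpenVINO runtime API):

```python
import xml.etree.ElementTree as ET

# Fragment mirroring the <rt_info> block at the end of the IR above.
rt_info_xml = '<rt_info><eos_token_id value="128003" /></rt_info>'

root = ET.fromstring(rt_info_xml)
# Read the end-of-sequence token id recorded in the model metadata.
eos_token_id = int(root.find("eos_token_id").get("value"))
print(eos_token_id)  # 128003
```

In practice the OpenVINO runtime exposes this metadata when the model is loaded; the fragment here only illustrates what the field contains.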
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
+ {
+ "bos_token": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|im_end|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|end_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
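The map above swaps Llama-3's default end-of-sequence token for the ChatML `<|im_end|>` token and designates `<|end_of_text|>` as padding. A minimal stdlib-only sketch of loading and inspecting it, with the JSON inlined from the file above:

```python
import json

# special_tokens_map.json as added in this commit (inlined for the sketch).
special_tokens_map = json.loads('''
{
  "bos_token": {"content": "<|begin_of_text|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
  "eos_token": {"content": "<|im_end|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
  "pad_token": {"content": "<|end_of_text|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false}
}
''')

# Generation should stop on the ChatML end-of-turn marker, not Llama-3's default.
print(special_tokens_map["eos_token"]["content"])  # <|im_end|>
print(special_tokens_map["pad_token"]["content"])  # <|end_of_text|>
```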
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,2071 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "128000": {
4
+ "content": "<|begin_of_text|>",
5
+ "lstrip": false,
6
+ "normalized": false,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": true
10
+ },
11
+ "128001": {
12
+ "content": "<|end_of_text|>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "128002": {
20
+ "content": "<|im_start|>",
21
+ "lstrip": false,
22
+ "normalized": false,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": true
26
+ },
27
+ "128003": {
28
+ "content": "<|im_end|>",
29
+ "lstrip": false,
30
+ "normalized": false,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": true
34
+ },
35
+ "128004": {
36
+ "content": "<tool_call>",
37
+ "lstrip": false,
38
+ "normalized": false,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": false
42
+ },
43
+ "128005": {
44
+ "content": "<|reserved_special_token_3|>",
45
+ "lstrip": false,
46
+ "normalized": false,
47
+ "rstrip": false,
48
+ "single_word": false,
49
+ "special": false
50
+ },
51
+ "128006": {
52
+ "content": "<|start_header_id|>",
53
+ "lstrip": false,
54
+ "normalized": false,
55
+ "rstrip": false,
56
+ "single_word": false,
57
+ "special": true
58
+ },
59
+ "128007": {
60
+ "content": "<|end_header_id|>",
61
+ "lstrip": false,
62
+ "normalized": false,
63
+ "rstrip": false,
64
+ "single_word": false,
65
+ "special": true
66
+ },
67
+ "128008": {
68
+ "content": "<tools>",
69
+ "lstrip": false,
70
+ "normalized": false,
71
+ "rstrip": false,
72
+ "single_word": false,
73
+ "special": false
74
+ },
75
+ "128009": {
76
+ "content": "<|eot_id|>",
77
+ "lstrip": false,
78
+ "normalized": false,
79
+ "rstrip": false,
80
+ "single_word": false,
81
+ "special": true
82
+ },
83
+ "128010": {
84
+ "content": "</tools>",
85
+ "lstrip": false,
86
+ "normalized": false,
87
+ "rstrip": false,
88
+ "single_word": false,
89
+ "special": false
90
+ },
91
+ "128011": {
92
+ "content": "</tool_call>",
93
+ "lstrip": false,
94
+ "normalized": false,
95
+ "rstrip": false,
96
+ "single_word": false,
97
+ "special": false
98
+ },
99
+ "128012": {
100
+ "content": "</tool_response>",
101
+ "lstrip": false,
102
+ "normalized": false,
103
+ "rstrip": false,
104
+ "single_word": false,
105
+ "special": false
106
+ },
107
+ "128013": {
108
+ "content": "<|reserved_special_token_8|>",
109
+ "lstrip": false,
110
+ "normalized": false,
111
+ "rstrip": false,
112
+ "single_word": false,
113
+ "special": true
114
+ },
115
+ "128014": {
116
+ "content": "<|reserved_special_token_9|>",
117
+ "lstrip": false,
118
+ "normalized": false,
119
+ "rstrip": false,
120
+ "single_word": false,
121
+ "special": true
122
+ },
123
+ "128015": {
124
+ "content": "<|reserved_special_token_10|>",
125
+ "lstrip": false,
126
+ "normalized": false,
127
+ "rstrip": false,
128
+ "single_word": false,
129
+ "special": true
130
+ },
131
+ "128016": {
132
+ "content": "<|reserved_special_token_11|>",
133
+ "lstrip": false,
134
+ "normalized": false,
135
+ "rstrip": false,
136
+ "single_word": false,
137
+ "special": true
138
+ },
139
+ "128017": {
140
+ "content": "<|reserved_special_token_12|>",
141
+ "lstrip": false,
142
+ "normalized": false,
143
+ "rstrip": false,
144
+ "single_word": false,
145
+ "special": true
146
+ },
147
+ "128018": {
148
+ "content": "<|reserved_special_token_13|>",
149
+ "lstrip": false,
150
+ "normalized": false,
151
+ "rstrip": false,
152
+ "single_word": false,
153
+ "special": true
154
+ },
155
+ "128019": {
156
+ "content": "<|reserved_special_token_14|>",
157
+ "lstrip": false,
158
+ "normalized": false,
159
+ "rstrip": false,
160
+ "single_word": false,
161
+ "special": true
162
+ },
163
+ "128020": {
164
+ "content": "<|reserved_special_token_15|>",
165
+ "lstrip": false,
166
+ "normalized": false,
167
+ "rstrip": false,
168
+ "single_word": false,
169
+ "special": true
170
+ },
171
+ "128021": {
172
+ "content": "<|reserved_special_token_16|>",
173
+ "lstrip": false,
174
+ "normalized": false,
175
+ "rstrip": false,
176
+ "single_word": false,
177
+ "special": true
178
+ },
179
+ "128022": {
180
+ "content": "<|reserved_special_token_17|>",
181
+ "lstrip": false,
182
+ "normalized": false,
183
+ "rstrip": false,
184
+ "single_word": false,
185
+ "special": true
186
+ },
187
+ "128023": {
188
+ "content": "<|reserved_special_token_18|>",
189
+ "lstrip": false,
190
+ "normalized": false,
191
+ "rstrip": false,
192
+ "single_word": false,
193
+ "special": true
194
+ },
195
+ "128024": {
196
+ "content": "<|reserved_special_token_19|>",
197
+ "lstrip": false,
198
+ "normalized": false,
199
+ "rstrip": false,
200
+ "single_word": false,
201
+ "special": true
202
+ },
203
+ "128025": {
204
+ "content": "<|reserved_special_token_20|>",
205
+ "lstrip": false,
206
+ "normalized": false,
207
+ "rstrip": false,
208
+ "single_word": false,
209
+ "special": true
210
+ },
211
+ "128026": {
212
+ "content": "<|reserved_special_token_21|>",
213
+ "lstrip": false,
214
+ "normalized": false,
215
+ "rstrip": false,
216
+ "single_word": false,
217
+ "special": true
218
+ },
219
+ "128027": {
220
+ "content": "<|reserved_special_token_22|>",
221
+ "lstrip": false,
222
+ "normalized": false,
223
+ "rstrip": false,
224
+ "single_word": false,
225
+ "special": true
226
+ },
227
+ "128028": {
228
+ "content": "<|reserved_special_token_23|>",
229
+ "lstrip": false,
230
+ "normalized": false,
231
+ "rstrip": false,
232
+ "single_word": false,
233
+ "special": true
234
+ },
235
+ "128029": {
236
+ "content": "<|reserved_special_token_24|>",
237
+ "lstrip": false,
238
+ "normalized": false,
239
+ "rstrip": false,
240
+ "single_word": false,
241
+ "special": true
242
+ },
243
+ "128030": {
244
+ "content": "<|reserved_special_token_25|>",
245
+ "lstrip": false,
246
+ "normalized": false,
247
+ "rstrip": false,
248
+ "single_word": false,
249
+ "special": true
250
+ },
251
+ "128031": {
252
+ "content": "<|reserved_special_token_26|>",
253
+ "lstrip": false,
254
+ "normalized": false,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": true
258
+ },
259
+ "128032": {
260
+ "content": "<|reserved_special_token_27|>",
261
+ "lstrip": false,
262
+ "normalized": false,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": true
266
+ },
267
+ "128033": {
268
+ "content": "<|reserved_special_token_28|>",
269
+ "lstrip": false,
270
+ "normalized": false,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": true
274
+ },
275
+ "128034": {
276
+ "content": "<|reserved_special_token_29|>",
277
+ "lstrip": false,
278
+ "normalized": false,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": true
282
+ },
283
+ "128035": {
284
+ "content": "<|reserved_special_token_30|>",
285
+ "lstrip": false,
286
+ "normalized": false,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": true
290
+ },
291
+ "128036": {
292
+ "content": "<|reserved_special_token_31|>",
293
+ "lstrip": false,
294
+ "normalized": false,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": true
298
+ },
299
+ "128037": {
300
+ "content": "<|reserved_special_token_32|>",
301
+ "lstrip": false,
302
+ "normalized": false,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": true
306
+ },
307
+ "128038": {
308
+ "content": "<|reserved_special_token_33|>",
309
+ "lstrip": false,
310
+ "normalized": false,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": true
314
+ },
315
+ "128039": {
316
+ "content": "<|reserved_special_token_34|>",
317
+ "lstrip": false,
318
+ "normalized": false,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": true
322
+ },
323
+ "128040": {
324
+ "content": "<|reserved_special_token_35|>",
325
+ "lstrip": false,
326
+ "normalized": false,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": true
330
+ },
331
+ "128041": {
332
+ "content": "<|reserved_special_token_36|>",
333
+ "lstrip": false,
334
+ "normalized": false,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": true
338
+ },
339
+ "128042": {
340
+ "content": "<|reserved_special_token_37|>",
341
+ "lstrip": false,
342
+ "normalized": false,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": true
346
+ },
347
+ "128043": {
348
+ "content": "<|reserved_special_token_38|>",
349
+ "lstrip": false,
350
+ "normalized": false,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": true
354
+ },
355
+ "128044": {
356
+ "content": "<|reserved_special_token_39|>",
357
+ "lstrip": false,
358
+ "normalized": false,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": true
362
+ },
363
+ "128045": {
364
+ "content": "<|reserved_special_token_40|>",
365
+ "lstrip": false,
366
+ "normalized": false,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": true
370
+ },
371
+ "128046": {
372
+ "content": "<|reserved_special_token_41|>",
373
+ "lstrip": false,
374
+ "normalized": false,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": true
378
+ },
379
+ "128047": {
380
+ "content": "<|reserved_special_token_42|>",
381
+ "lstrip": false,
382
+ "normalized": false,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": true
386
+ },
387
+ "128048": {
388
+ "content": "<|reserved_special_token_43|>",
389
+ "lstrip": false,
390
+ "normalized": false,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": true
394
+ },
395
+ "128049": {
396
+ "content": "<|reserved_special_token_44|>",
397
+ "lstrip": false,
398
+ "normalized": false,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": true
402
+ },
403
+ "128050": {
404
+ "content": "<|reserved_special_token_45|>",
405
+ "lstrip": false,
406
+ "normalized": false,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": true
410
+ },
411
+ "128051": {
412
+ "content": "<|reserved_special_token_46|>",
413
+ "lstrip": false,
414
+ "normalized": false,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": true
418
+ },
419
+ "128052": {
420
+ "content": "<|reserved_special_token_47|>",
421
+ "lstrip": false,
422
+ "normalized": false,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": true
426
+ },
427
+ "128053": {
428
+ "content": "<|reserved_special_token_48|>",
429
+ "lstrip": false,
430
+ "normalized": false,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": true
434
+ },
435
+ "128054": {
436
+ "content": "<|reserved_special_token_49|>",
437
+ "lstrip": false,
438
+ "normalized": false,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": true
442
+ },
443
+ "128055": {
444
+ "content": "<|reserved_special_token_50|>",
445
+ "lstrip": false,
446
+ "normalized": false,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": true
450
+ },
451
+ "128056": {
452
+ "content": "<|reserved_special_token_51|>",
453
+ "lstrip": false,
454
+ "normalized": false,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": true
458
+ },
459
+ "128057": {
460
+ "content": "<|reserved_special_token_52|>",
461
+ "lstrip": false,
462
+ "normalized": false,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": true
466
+ },
467
+ "128058": {
468
+ "content": "<|reserved_special_token_53|>",
469
+ "lstrip": false,
470
+ "normalized": false,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": true
474
+ },
475
+ "128059": {
476
+ "content": "<|reserved_special_token_54|>",
477
+ "lstrip": false,
478
+ "normalized": false,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": true
482
+ },
483
+ "128060": {
484
+ "content": "<|reserved_special_token_55|>",
485
+ "lstrip": false,
486
+ "normalized": false,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": true
490
+ },
491
+ "128061": {
492
+ "content": "<|reserved_special_token_56|>",
493
+ "lstrip": false,
494
+ "normalized": false,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": true
498
+ },
499
+ "128062": {
500
+ "content": "<|reserved_special_token_57|>",
501
+ "lstrip": false,
502
+ "normalized": false,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": true
506
+ },
507
+ "128063": {
508
+ "content": "<|reserved_special_token_58|>",
509
+ "lstrip": false,
510
+ "normalized": false,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": true
514
+ },
515
+ "128064": {
516
+ "content": "<|reserved_special_token_59|>",
517
+ "lstrip": false,
518
+ "normalized": false,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": true
522
+ },
523
+ "128065": {
524
+ "content": "<|reserved_special_token_60|>",
525
+ "lstrip": false,
526
+ "normalized": false,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": true
530
+ },
531
+ "128066": {
532
+ "content": "<|reserved_special_token_61|>",
533
+ "lstrip": false,
534
+ "normalized": false,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": true
538
+ },
539
+ "128067": {
540
+ "content": "<|reserved_special_token_62|>",
541
+ "lstrip": false,
542
+ "normalized": false,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": true
546
+ },
547
+ "128068": {
548
+ "content": "<|reserved_special_token_63|>",
549
+ "lstrip": false,
550
+ "normalized": false,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": true
554
+ },
555
+ "128069": {
556
+ "content": "<|reserved_special_token_64|>",
557
+ "lstrip": false,
558
+ "normalized": false,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": true
562
+ },
563
+ "128070": {
564
+ "content": "<|reserved_special_token_65|>",
565
+ "lstrip": false,
566
+ "normalized": false,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": true
570
+ },
571
+ "128071": {
572
+ "content": "<|reserved_special_token_66|>",
573
+ "lstrip": false,
574
+ "normalized": false,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": true
578
+ },
579
+ "128072": {
580
+ "content": "<|reserved_special_token_67|>",
581
+ "lstrip": false,
582
+ "normalized": false,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": true
586
+ },
587
+ "128073": {
588
+ "content": "<|reserved_special_token_68|>",
589
+ "lstrip": false,
590
+ "normalized": false,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": true
594
+ },
595
+ "128074": {
596
+ "content": "<|reserved_special_token_69|>",
597
+ "lstrip": false,
598
+ "normalized": false,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": true
602
+ },
603
+ "128075": {
604
+ "content": "<|reserved_special_token_70|>",
605
+ "lstrip": false,
606
+ "normalized": false,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": true
610
+ },
611
+ "128076": {
612
+ "content": "<|reserved_special_token_71|>",
613
+ "lstrip": false,
614
+ "normalized": false,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": true
618
+ },
619
+ "128077": {
620
+ "content": "<|reserved_special_token_72|>",
621
+ "lstrip": false,
622
+ "normalized": false,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": true
626
+ },
627
+ "128078": {
628
+ "content": "<|reserved_special_token_73|>",
629
+ "lstrip": false,
630
+ "normalized": false,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": true
634
+ },
635
+ "128079": {
636
+ "content": "<|reserved_special_token_74|>",
637
+ "lstrip": false,
638
+ "normalized": false,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": true
642
+ },
643
+ "128080": {
644
+ "content": "<|reserved_special_token_75|>",
645
+ "lstrip": false,
646
+ "normalized": false,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": true
650
+ },
651
+ "128081": {
652
+ "content": "<|reserved_special_token_76|>",
653
+ "lstrip": false,
654
+ "normalized": false,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": true
658
+ },
659
+ "128082": {
660
+ "content": "<|reserved_special_token_77|>",
661
+ "lstrip": false,
662
+ "normalized": false,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": true
666
+ },
667
+ "128083": {
668
+ "content": "<|reserved_special_token_78|>",
669
+ "lstrip": false,
670
+ "normalized": false,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": true
674
+ },
675
+ "128084": {
676
+ "content": "<|reserved_special_token_79|>",
677
+ "lstrip": false,
678
+ "normalized": false,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": true
682
+ },
683
+ "128085": {
684
+ "content": "<|reserved_special_token_80|>",
685
+ "lstrip": false,
686
+ "normalized": false,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": true
690
+ },
691
+ "128086": {
692
+ "content": "<|reserved_special_token_81|>",
693
+ "lstrip": false,
694
+ "normalized": false,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": true
698
+ },
699
+ "128087": {
700
+ "content": "<|reserved_special_token_82|>",
701
+ "lstrip": false,
702
+ "normalized": false,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": true
706
+ },
707
+ "128088": {
708
+ "content": "<|reserved_special_token_83|>",
709
+ "lstrip": false,
710
+ "normalized": false,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": true
714
+ },
715
+ "128089": {
716
+ "content": "<|reserved_special_token_84|>",
717
+ "lstrip": false,
718
+ "normalized": false,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": true
722
+ },
723
+ "128090": {
724
+ "content": "<|reserved_special_token_85|>",
725
+ "lstrip": false,
726
+ "normalized": false,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": true
730
+ },
731
+ "128091": {
732
+ "content": "<|reserved_special_token_86|>",
733
+ "lstrip": false,
734
+ "normalized": false,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": true
738
+ },
739
+ "128092": {
740
+ "content": "<|reserved_special_token_87|>",
741
+ "lstrip": false,
742
+ "normalized": false,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": true
746
+ },
747
+ "128093": {
748
+ "content": "<|reserved_special_token_88|>",
749
+ "lstrip": false,
750
+ "normalized": false,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": true
754
+ },
755
+ "128094": {
756
+ "content": "<|reserved_special_token_89|>",
757
+ "lstrip": false,
758
+ "normalized": false,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": true
762
+ },
763
+ "128095": {
764
+ "content": "<|reserved_special_token_90|>",
765
+ "lstrip": false,
766
+ "normalized": false,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": true
770
+ },
771
+ "128096": {
772
+ "content": "<|reserved_special_token_91|>",
773
+ "lstrip": false,
774
+ "normalized": false,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": true
778
+ },
779
+ "128097": {
780
+ "content": "<|reserved_special_token_92|>",
781
+ "lstrip": false,
782
+ "normalized": false,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": true
786
+ },
787
+ "128098": {
788
+ "content": "<|reserved_special_token_93|>",
789
+ "lstrip": false,
790
+ "normalized": false,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": true
794
+ },
795
+ "128099": {
796
+ "content": "<|reserved_special_token_94|>",
797
+ "lstrip": false,
798
+ "normalized": false,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": true
802
+ },
803
+ "128100": {
804
+ "content": "<|reserved_special_token_95|>",
805
+ "lstrip": false,
806
+ "normalized": false,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": true
810
+ },
811
+ "128101": {
812
+ "content": "<|reserved_special_token_96|>",
813
+ "lstrip": false,
814
+ "normalized": false,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": true
818
+ },
819
+ "128102": {
820
+ "content": "<|reserved_special_token_97|>",
821
+ "lstrip": false,
822
+ "normalized": false,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": true
826
+ },
827
+ "128103": {
828
+ "content": "<|reserved_special_token_98|>",
829
+ "lstrip": false,
830
+ "normalized": false,
831
+ "rstrip": false,
832
+ "single_word": false,
833
+ "special": true
834
+ },
835
+ "128104": {
836
+ "content": "<|reserved_special_token_99|>",
837
+ "lstrip": false,
838
+ "normalized": false,
839
+ "rstrip": false,
840
+ "single_word": false,
841
+ "special": true
842
+ },
843
+ "128105": {
844
+ "content": "<|reserved_special_token_100|>",
845
+ "lstrip": false,
846
+ "normalized": false,
847
+ "rstrip": false,
848
+ "single_word": false,
849
+ "special": true
850
+ },
851
+ "128106": {
852
+ "content": "<|reserved_special_token_101|>",
853
+ "lstrip": false,
854
+ "normalized": false,
855
+ "rstrip": false,
856
+ "single_word": false,
857
+ "special": true
858
+ },
859
+ "128107": {
860
+ "content": "<|reserved_special_token_102|>",
861
+ "lstrip": false,
862
+ "normalized": false,
863
+ "rstrip": false,
864
+ "single_word": false,
865
+ "special": true
866
+ },
867
+ "128108": {
868
+ "content": "<|reserved_special_token_103|>",
869
+ "lstrip": false,
870
+ "normalized": false,
871
+ "rstrip": false,
872
+ "single_word": false,
873
+ "special": true
874
+ },
875
+ "128109": {
876
+ "content": "<|reserved_special_token_104|>",
877
+ "lstrip": false,
878
+ "normalized": false,
879
+ "rstrip": false,
880
+ "single_word": false,
881
+ "special": true
882
+ },
883
+ "128110": {
884
+ "content": "<|reserved_special_token_105|>",
885
+ "lstrip": false,
886
+ "normalized": false,
887
+ "rstrip": false,
888
+ "single_word": false,
889
+ "special": true
890
+ },
891
+ "128111": {
892
+ "content": "<|reserved_special_token_106|>",
893
+ "lstrip": false,
894
+ "normalized": false,
895
+ "rstrip": false,
896
+ "single_word": false,
897
+ "special": true
898
+ },
899
+ "128112": {
900
+ "content": "<|reserved_special_token_107|>",
901
+ "lstrip": false,
902
+ "normalized": false,
903
+ "rstrip": false,
904
+ "single_word": false,
905
+ "special": true
906
+ },
907
+ "128113": {
908
+ "content": "<|reserved_special_token_108|>",
909
+ "lstrip": false,
910
+ "normalized": false,
911
+ "rstrip": false,
912
+ "single_word": false,
913
+ "special": true
914
+ },
915
+ "128114": {
916
+ "content": "<|reserved_special_token_109|>",
917
+ "lstrip": false,
918
+ "normalized": false,
919
+ "rstrip": false,
920
+ "single_word": false,
921
+ "special": true
922
+ },
923
+ "128115": {
924
+ "content": "<|reserved_special_token_110|>",
925
+ "lstrip": false,
926
+ "normalized": false,
927
+ "rstrip": false,
928
+ "single_word": false,
929
+ "special": true
930
+ },
931
+ "128116": {
932
+ "content": "<|reserved_special_token_111|>",
933
+ "lstrip": false,
934
+ "normalized": false,
935
+ "rstrip": false,
936
+ "single_word": false,
937
+ "special": true
938
+ },
939
+ "128117": {
940
+ "content": "<|reserved_special_token_112|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128118": {
+ "content": "<|reserved_special_token_113|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128119": {
+ "content": "<|reserved_special_token_114|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128120": {
+ "content": "<|reserved_special_token_115|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128121": {
+ "content": "<|reserved_special_token_116|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128122": {
+ "content": "<|reserved_special_token_117|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128123": {
+ "content": "<|reserved_special_token_118|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128124": {
+ "content": "<|reserved_special_token_119|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128125": {
+ "content": "<|reserved_special_token_120|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128126": {
+ "content": "<|reserved_special_token_121|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128127": {
+ "content": "<|reserved_special_token_122|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128128": {
+ "content": "<|reserved_special_token_123|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128129": {
+ "content": "<|reserved_special_token_124|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128130": {
+ "content": "<|reserved_special_token_125|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128131": {
+ "content": "<|reserved_special_token_126|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128132": {
+ "content": "<|reserved_special_token_127|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128133": {
+ "content": "<|reserved_special_token_128|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128134": {
+ "content": "<|reserved_special_token_129|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128135": {
+ "content": "<|reserved_special_token_130|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128136": {
+ "content": "<|reserved_special_token_131|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128137": {
+ "content": "<|reserved_special_token_132|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128138": {
+ "content": "<|reserved_special_token_133|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128139": {
+ "content": "<|reserved_special_token_134|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128140": {
+ "content": "<|reserved_special_token_135|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128141": {
+ "content": "<|reserved_special_token_136|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128142": {
+ "content": "<|reserved_special_token_137|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128143": {
+ "content": "<|reserved_special_token_138|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128144": {
+ "content": "<|reserved_special_token_139|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128145": {
+ "content": "<|reserved_special_token_140|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128146": {
+ "content": "<|reserved_special_token_141|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128147": {
+ "content": "<|reserved_special_token_142|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128148": {
+ "content": "<|reserved_special_token_143|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128149": {
+ "content": "<|reserved_special_token_144|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128150": {
+ "content": "<|reserved_special_token_145|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128151": {
+ "content": "<|reserved_special_token_146|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128152": {
+ "content": "<|reserved_special_token_147|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128153": {
+ "content": "<|reserved_special_token_148|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128154": {
+ "content": "<|reserved_special_token_149|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128155": {
+ "content": "<|reserved_special_token_150|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128156": {
+ "content": "<|reserved_special_token_151|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128157": {
+ "content": "<|reserved_special_token_152|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128158": {
+ "content": "<|reserved_special_token_153|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128159": {
+ "content": "<|reserved_special_token_154|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128160": {
+ "content": "<|reserved_special_token_155|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128161": {
+ "content": "<|reserved_special_token_156|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128162": {
+ "content": "<|reserved_special_token_157|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128163": {
+ "content": "<|reserved_special_token_158|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128164": {
+ "content": "<|reserved_special_token_159|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128165": {
+ "content": "<|reserved_special_token_160|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128166": {
+ "content": "<|reserved_special_token_161|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128167": {
+ "content": "<|reserved_special_token_162|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128168": {
+ "content": "<|reserved_special_token_163|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128169": {
+ "content": "<|reserved_special_token_164|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128170": {
+ "content": "<|reserved_special_token_165|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128171": {
+ "content": "<|reserved_special_token_166|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128172": {
+ "content": "<|reserved_special_token_167|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128173": {
+ "content": "<|reserved_special_token_168|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128174": {
+ "content": "<|reserved_special_token_169|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128175": {
+ "content": "<|reserved_special_token_170|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128176": {
+ "content": "<|reserved_special_token_171|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128177": {
+ "content": "<|reserved_special_token_172|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128178": {
+ "content": "<|reserved_special_token_173|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128179": {
+ "content": "<|reserved_special_token_174|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128180": {
+ "content": "<|reserved_special_token_175|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128181": {
+ "content": "<|reserved_special_token_176|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128182": {
+ "content": "<|reserved_special_token_177|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128183": {
+ "content": "<|reserved_special_token_178|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128184": {
+ "content": "<|reserved_special_token_179|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128185": {
+ "content": "<|reserved_special_token_180|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128186": {
+ "content": "<|reserved_special_token_181|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128187": {
+ "content": "<|reserved_special_token_182|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128188": {
+ "content": "<|reserved_special_token_183|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128189": {
+ "content": "<|reserved_special_token_184|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128190": {
+ "content": "<|reserved_special_token_185|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128191": {
+ "content": "<|reserved_special_token_186|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128192": {
+ "content": "<|reserved_special_token_187|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128193": {
+ "content": "<|reserved_special_token_188|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128194": {
+ "content": "<|reserved_special_token_189|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128195": {
+ "content": "<|reserved_special_token_190|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128196": {
+ "content": "<|reserved_special_token_191|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128197": {
+ "content": "<|reserved_special_token_192|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128198": {
+ "content": "<|reserved_special_token_193|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128199": {
+ "content": "<|reserved_special_token_194|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128200": {
+ "content": "<|reserved_special_token_195|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128201": {
+ "content": "<|reserved_special_token_196|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128202": {
+ "content": "<|reserved_special_token_197|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128203": {
+ "content": "<|reserved_special_token_198|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128204": {
+ "content": "<|reserved_special_token_199|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128205": {
+ "content": "<|reserved_special_token_200|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128206": {
+ "content": "<|reserved_special_token_201|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128207": {
+ "content": "<|reserved_special_token_202|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128208": {
+ "content": "<|reserved_special_token_203|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128209": {
+ "content": "<|reserved_special_token_204|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128210": {
+ "content": "<|reserved_special_token_205|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128211": {
+ "content": "<|reserved_special_token_206|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128212": {
+ "content": "<|reserved_special_token_207|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128213": {
+ "content": "<|reserved_special_token_208|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128214": {
+ "content": "<|reserved_special_token_209|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128215": {
+ "content": "<|reserved_special_token_210|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128216": {
+ "content": "<|reserved_special_token_211|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128217": {
+ "content": "<|reserved_special_token_212|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128218": {
+ "content": "<|reserved_special_token_213|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128219": {
+ "content": "<|reserved_special_token_214|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128220": {
+ "content": "<|reserved_special_token_215|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128221": {
+ "content": "<|reserved_special_token_216|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128222": {
+ "content": "<|reserved_special_token_217|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128223": {
+ "content": "<|reserved_special_token_218|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128224": {
+ "content": "<|reserved_special_token_219|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128225": {
+ "content": "<|reserved_special_token_220|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128226": {
+ "content": "<|reserved_special_token_221|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128227": {
+ "content": "<|reserved_special_token_222|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128228": {
+ "content": "<|reserved_special_token_223|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128229": {
+ "content": "<|reserved_special_token_224|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128230": {
+ "content": "<|reserved_special_token_225|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128231": {
+ "content": "<|reserved_special_token_226|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128232": {
+ "content": "<|reserved_special_token_227|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128233": {
+ "content": "<|reserved_special_token_228|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128234": {
+ "content": "<|reserved_special_token_229|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128235": {
+ "content": "<|reserved_special_token_230|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128236": {
+ "content": "<|reserved_special_token_231|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128237": {
+ "content": "<|reserved_special_token_232|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128238": {
+ "content": "<|reserved_special_token_233|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128239": {
+ "content": "<|reserved_special_token_234|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128240": {
+ "content": "<|reserved_special_token_235|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128241": {
+ "content": "<|reserved_special_token_236|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128242": {
+ "content": "<|reserved_special_token_237|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128243": {
+ "content": "<|reserved_special_token_238|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128244": {
+ "content": "<|reserved_special_token_239|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128245": {
+ "content": "<|reserved_special_token_240|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128246": {
+ "content": "<|reserved_special_token_241|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128247": {
+ "content": "<|reserved_special_token_242|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128248": {
+ "content": "<|reserved_special_token_243|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128249": {
+ "content": "<|reserved_special_token_244|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128250": {
+ "content": "<|reserved_special_token_245|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128251": {
+ "content": "<|reserved_special_token_246|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128252": {
+ "content": "<|reserved_special_token_247|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128253": {
+ "content": "<|reserved_special_token_248|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128254": {
+ "content": "<|reserved_special_token_249|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128255": {
+ "content": "<|reserved_special_token_250|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "128256": {
+ "content": "<tool_response>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ }
+ },
+ "bos_token": "<|begin_of_text|>",
+ "chat_template": "{{bos_token}}{% for message in messages %}{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}",
+ "clean_up_tokenization_spaces": true,
+ "eos_token": "<|im_end|>",
+ "model_input_names": [
+ "input_ids",
+ "attention_mask"
+ ],
+ "model_max_length": 1000000000000000019884624838656,
+ "pad_token": "<|end_of_text|>",
+ "tokenizer_class": "PreTrainedTokenizerFast"
+ }
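
The `chat_template` added above is standard ChatML: the BOS token, then one `<|im_start|>role\ncontent<|im_end|>\n` block per message, and an opening assistant block when a generation prompt is requested. As a sketch of what that template renders (a hand-rolled Python equivalent of the Jinja template; in practice `transformers` applies it via `tokenizer.apply_chat_template`, and the message list here is only an example):

```python
# Hand-rolled sketch of the ChatML chat_template above.
# The real rendering is done by the Jinja engine inside the
# transformers tokenizer; this mirrors it step by step.

BOS = "<|begin_of_text|>"  # matches "bos_token" in the config

def render_chatml(messages, add_generation_prompt=True):
    # {{bos_token}}
    out = BOS
    # {% for message in messages %} '<|im_start|>' + role + '\n' + content + '<|im_end|>' + '\n'
    for m in messages:
        out += "<|im_start|>" + m["role"] + "\n" + m["content"] + "<|im_end|>" + "\n"
    # {% if add_generation_prompt %} '<|im_start|>assistant\n'
    if add_generation_prompt:
        out += "<|im_start|>assistant\n"
    return out

prompt = render_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

Generation then continues from the trailing `<|im_start|>assistant\n` block and stops at the `<|im_end|>` EOS token declared in the config.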