octopus-planning / added_tokens.json
{
"<nexa_end>": 32012,
"<nexa_split>": 32011,
"<|assistant|>": 32001,
"<|endoftext|>": 32000,
"<|end|>": 32007,
"<|placeholder1|>": 32002,
"<|placeholder2|>": 32003,
"<|placeholder3|>": 32004,
"<|placeholder4|>": 32005,
"<|placeholder5|>": 32008,
"<|placeholder6|>": 32009,
"<|system|>": 32006,
"<|user|>": 32010
}
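
A minimal sketch of how a file like this is typically consumed: with the Hugging Face `transformers` library, these entries are registered as added special tokens when the tokenizer is loaded, so each string resolves to the ID listed above. The repo identifier below is a placeholder, not confirmed by this file.

```python
# Minimal sketch, assuming the tokenizer is published as a Hugging Face repo.
# "octopus-planning" below is a hypothetical identifier; substitute the real repo path.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("octopus-planning")

# Added tokens such as <nexa_end> and <nexa_split> should map to the IDs
# defined in added_tokens.json (32012, 32011, ...).
for token in ["<nexa_end>", "<nexa_split>", "<|user|>", "<|assistant|>", "<|end|>"]:
    print(token, tokenizer.convert_tokens_to_ids(token))
```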